The Transformative Impact of Data and Communication on Governance


Steven Livingston at Brookings: “How do digital technologies affect governance in areas of limited statehood – places and circumstances characterized by the absence of state provisioning of public goods and the enforcement of binding rules with a monopoly of legitimate force? In the first post in this series I introduced the limited statehood concept and then described the tremendous growth in mobile telephony, GIS, and other technologies in the developing world. In the second post I offered examples of the use of ICT in initiatives intended to fill at least some of the governance vacuum created by limited statehood. With mobile phones, for example, farmers are informed of market conditions and have access to liquidity through M-Pesa and similar mobile money platforms….
This brings to mind another type of ICT governance initiative. Rather than fill in for or even displace the state, some ICT initiatives can strengthen governance capacity. Digital government – the use of digital technology by the state itself – is one important possibility. Other initiatives strengthen the state by exerting pressure. Countries with weak governance sometimes take the form of extractive states, or states that cater to the needs of an elite, leaving the majority of the population in poverty and without basic public services. This is what Daron Acemoglu and James A. Robinson call extractive political and economic institutions. Inclusive states, on the other hand, are pluralistic, bound by the rule of law, respectful of property rights, and, in general, accountable. Accountability mechanisms such as a free press and competitive multiparty elections are instrumental in discouraging extractive institutions. What ICT-based initiatives might lend a hand in strengthening accountability? We can point to three examples.

Example One: Using ICT to Protect Human Rights

Nonstate actors now use commercial, high-resolution remote sensing satellites to monitor weapons programs and human rights violations.  Amnesty International’s Remote Sensing for Human Rights offers one example, and Satellite Sentinel offers another.  Both use imagery from DigitalGlobe, an American remote sensing and geospatial content company.   Other organizations have used commercially available remote sensing imagery to monitor weapons proliferation.  The Institute for Science and International Security, a Washington-based NGO, revealed the Iranian nuclear weapons program in 2003 using commercial satellite imagery…

Example Two: Crowdsourcing Election Observation

Others have used mobile phones and GIS to crowdsource election observation. For the 2011 elections in Nigeria, the Community Life Project, a civil society organization, created ReclaimNaija, an election-monitoring system that relied on GIS and amateur observers with mobile phones. On the platform's live map, each red dot represents an aggregation of geo-located incidents reported to ReclaimNaija; clicking on a dot disaggregates the reports, eventually taking the reader to individual reports. Rigorous statistical analysis of ReclaimNaija results and the elections suggests the platform contributed to the effectiveness of the election process.

ReclaimNaija: Election Incident Reporting System Map
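The aggregation behind such a map can be approximated with simple grid binning: nearby reports are merged into one dot whose size reflects the number of underlying incidents. The sketch below is a generic illustration of that idea, not ReclaimNaija's actual code; the reports, coordinates, and cell size are invented.

```python
# Sketch: grouping geo-located incident reports into map "dots" by grid cell.
# The reports below are invented; a real platform would read them from its database.
from collections import defaultdict

reports = [
    {"lat": 6.452, "lon": 3.396, "text": "late opening of polling unit"},
    {"lat": 6.455, "lon": 3.398, "text": "missing result sheets"},
    {"lat": 9.057, "lon": 7.495, "text": "voter intimidation"},
]

CELL = 0.05  # grid cell size in degrees; coarser cells mean fewer, larger dots

clusters = defaultdict(list)
for r in reports:
    key = (round(r["lat"] / CELL), round(r["lon"] / CELL))
    clusters[key].append(r)

# Each cluster becomes one dot; clicking it would "disaggregate" into the raw reports.
for items in clusters.values():
    lat = sum(r["lat"] for r in items) / len(items)
    lon = sum(r["lon"] for r in items) / len(items)
    print(f"dot at ({lat:.3f}, {lon:.3f}) aggregating {len(items)} report(s)")
```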

Example Three: Using Genetic Analysis to Identify War Crimes

In recent years, more powerful computers have led to major breakthroughs in biomedical science.  The reduction in cost of analyzing the human genome has actually outpaced Moore’s Law.  This has opened up new possibilities for the use of genetic analysis in forensic anthropology.   In Guatemala, the Balkans, Argentina, Peru and in several other places where mass executions and genocides took place, forensic anthropologists are using genetic analysis to find evidence that is used to hold the killers – often state actors – accountable…”

Historic release of data delivers unprecedented transparency on the medical services physicians provide and how much they are paid


Jonathan Blum, Principal Deputy Administrator, Centers for Medicare & Medicaid Services: “Today the Centers for Medicare & Medicaid Services (CMS) took a major step forward in making Medicare data more transparent and accessible, while maintaining the privacy of beneficiaries, by announcing the release of new data on medical services and procedures furnished to Medicare fee-for-service beneficiaries by physicians and other healthcare professionals (http://www.cms.gov/newsroom/newsroom-center.html). For too long, the only information on physicians readily available to consumers was physician name, address and phone number. This data will, for the first time, provide a better picture of how physicians practice in the Medicare program.
This new data set includes over nine million rows of data on more than 880,000 physicians and other healthcare professionals in all 50 states, DC and Puerto Rico providing care to Medicare beneficiaries in 2012. The data set presents key information on the provision of services by physicians and how much they are paid for those services, and is organized by provider (National Provider Identifier or NPI), type of service (Healthcare Common Procedure Coding System, or HCPCS) code, and whether the service was performed in a facility or office setting. This public data set includes the number of services, average submitted charges, average allowed amount, average Medicare payment, and a count of unique beneficiaries treated. CMS takes beneficiary privacy very seriously and we will protect patient-identifiable information by redacting any data in cases where it includes fewer than 11 beneficiaries.
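As a rough sketch of how such a file can be worked with, the snippet below loads a provider-level utilization file, applies the same kind of small-cell suppression described above (dropping rows covering fewer than 11 beneficiaries), and summarizes services and payments by provider and procedure code. The file name and column names are assumptions for illustration, not the official field names of the CMS release.

```python
# Minimal sketch: summarizing a CMS-style provider utilization file.
# The file path and column names below are assumed for illustration;
# the real public use file documents its own field names.
import pandas as pd

cols = ["npi", "hcpcs_code", "place_of_service",
        "line_srvc_cnt", "bene_unique_cnt", "average_medicare_payment_amt"]
df = pd.read_csv("physician_utilization_2012.csv", usecols=cols)

# Mirror the privacy rule described above: suppress rows that would
# reveal information about fewer than 11 beneficiaries.
df = df[df["bene_unique_cnt"] >= 11]

# One row per provider/service/setting: total services and average payment.
summary = (df.groupby(["npi", "hcpcs_code", "place_of_service"])
             .agg(services=("line_srvc_cnt", "sum"),
                  avg_payment=("average_medicare_payment_amt", "mean"))
             .reset_index())
print(summary.head())
```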
Previously, CMS could not release this information due to a permanent injunction issued by a court in 1979. However, in May 2013, the court vacated this injunction, setting in motion a series of events that has enabled CMS to make this information available for the first time.
Data to Fuel Research and Innovation
In addition to the public data release, CMS is making slight modifications to the process to request CMS data for research purposes. This will allow researchers to conduct important research at the physician level. As with the public release of information described above, CMS will continue to prohibit the release of patient-identifiable information. For more information about CMS’s disclosures to researchers, please contact the Research Data Assistance Center (ResDAC) at http://www.resdac.org/.
Unprecedented Data Access
This data release follows other CMS efforts to make more data available to the public. Since 2010, the agency has released an unprecedented amount of aggregated data in machine-readable form, with much of it available at http://www.healthdata.gov. These data range from previously unpublished statistics on Medicare spending, utilization, and quality at the state, hospital referral region, and county level, to detailed information on the quality performance of hospitals, nursing homes, and other providers.
In May 2013, CMS released information on the average charges for the 100 most common inpatient services at more than 3,000 hospitals nationwide (http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/Inpatient.html).
In June 2013, CMS released average charges for 30 selected outpatient procedures (http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/Outpatient.html).
We will continue to work toward harnessing the power of data to promote quality and value, and improve the health of our seniors and persons with disabilities.”

AU: Revitalising and revising the Innovation Showcase


From the Public Sector Innovation Toolkit unit of the Australian Government: “Do you have any case studies of innovative initiatives in the public service?
An important part of the public sector innovation agenda is sharing examples of innovation in practice. That’s why we created the Public Sector Innovation Showcase.
As noted in the APS Innovation Action Plan, “The Public Sector Innovation Showcase will enable government agencies and departments to share and celebrate case studies of innovation, and to consider how they might apply such innovative practices within their own operations to achieve better outcomes.”
The Showcase was a joint initiative with the Department of Finance and has been operating for a number of years now. We thought it was time for some changes and some new examples.
To make the showcase more useful we have incorporated it into this site – you can see the examples here. We are eager to receive more examples from the public sector – from the Commonwealth, state, territory and local governments.
Please get in contact with us if you have an example that might be suitable as a case study of innovation in the public sector. The sort of things we’re after in the case studies are spelled out in our Showcase submission guidance.
We’re seeking examples that demonstrate doing things differently, rather than doing what we do now but slightly better.

Book Review: 'The Rule of Nobody' by Philip K. Howard


Stuart Taylor Jr in the Wall Street Journal: “Amid the liberal-conservative ideological clash that paralyzes our government, it’s always refreshing to encounter the views of Philip K. Howard, whose ideology is common sense spiked with a sense of urgency. In “The Rule of Nobody,” Mr. Howard shows how federal, state and local laws and regulations have programmed officials of both parties to follow rules so detailed, rigid and, often, obsolete as to leave little room for human judgment. He argues passionately that we will never solve our social problems until we abandon what he calls a misguided legal philosophy of seeking to put government on regulatory autopilot. He also predicts that our legal-governmental structure is “headed toward a stall and then a frightening plummet toward insolvency and political chaos.”
Mr. Howard, a big-firm lawyer who heads the nonpartisan government-reform coalition Common Good, is no conventional deregulator. But he warns that the “cumulative complexity” of the dense rulebooks that prescribe “every nuance of how law is implemented” leaves good officials without the freedom to do what makes sense on the ground. Stripped of the authority that they should have, he adds, officials have little accountability for bad results. More broadly, he argues that the very structure of our democracy is so clogged by deep thickets of dysfunctional law that it will only get worse unless conservatives and liberals alike cast off their distrust of human discretion.
The rulebooks should be “radically simplified,” Mr. Howard says, on matters ranging from enforcing school discipline to protecting nursing-home residents, from operating safe soup kitchens to building the nation’s infrastructure: Projects now often require multi-year, 5,000-page environmental impact statements before anything can begin to be constructed. Unduly detailed rules should be replaced by general principles, he says, that take their meaning from society’s norms and values and embrace the need for official discretion and responsibility.
Mr. Howard serves up a rich menu of anecdotes, including both the small-scale activities of a neighborhood and the vast administrative structures that govern national life. After a tree fell into a stream and caused flooding during a winter storm, Franklin Township, N.J., was barred from pulling the tree out until it had spent 12 days and $12,000 for the permits and engineering work that a state environmental rule required for altering any natural condition in a “C-1 stream.” The “Volcker Rule,” designed to prevent banks from using federally insured deposits to speculate in securities, was shaped by five federal agencies and countless banking lobbyists into 963 “almost unintelligible” pages. In New York City, “disciplining a student potentially requires 66 separate steps, including several levels of potential appeals”; meanwhile, civil-service rules make it virtually impossible to terminate thousands of incompetent employees. Children’s lemonade stands in several states have been closed down for lack of a vendor’s license.
Conservatives as well as liberals like detailed rules—complete with tedious forms, endless studies and wasteful legal hearings—because they don’t trust each other with discretion. Corporations like them because they provide not only certainty but also “a barrier to entry for potential competitors,” by raising the cost of doing business to prohibitive levels for small businesses with fresh ideas and other new entrants to markets. Public employees like them because detailed rules “absolve them of responsibility.” And, adds Mr. Howard, “lawsuits [have] exploded in this rules-based regime,” shifting legal power to “self-interested plaintiffs’ lawyers,” who have learned that they “could sue for the moon and extract settlements even in cases (as with some asbestos claims) that were fraudulent.”
So habituated have we become to such stuff, Mr. Howard says, that government’s “self-inflicted ineptitude is accepted as a state of nature, as if spending an average of eight years on environmental reviews—which should be a national scandal—were an unavoidable mountain range.” Common-sensical laws would place outer boundaries on acceptable conduct based on reasonable norms that are “far better at preventing abuse of power than today’s regulatory minefield.”
As Mr. Howard notes, his book is part of a centuries-old rules-versus-principles debate. The philosophers and writers whom he quotes approvingly include Aristotle, James Madison, Isaiah Berlin and Roscoe Pound, a prominent Harvard law professor and dean who condemned “mechanical jurisprudence” and championed broad official discretion. Berlin, for his part, warned against “monstrous bureaucratic machines, built in accordance with the rules that ignore the teeming variety of the living world, the untidy and asymmetrical inner lives of men, and crush them into conformity.” Mr. Howard juxtaposes today’s roughly 100 million words of federal law and regulations with Madison’s warning that laws should not be “so voluminous that they cannot be read, or so incoherent that they cannot be understood.”…

Let’s get geeks into government


Gillian Tett in the Financial Times: “Fifteen years ago, Brett Goldstein seemed to be just another tech entrepreneur. He was working as IT director of OpenTable, then a start-up website for restaurant bookings. The company was thriving – and subsequently did a very successful initial public offering. Life looked very sweet for Goldstein. But when the World Trade Center was attacked in 2001, Goldstein had a moment of epiphany. “I spent seven years working in a startup but, directly after 9/11, I knew I didn’t want my whole story to be about how I helped people make restaurant reservations. I wanted to work in public service, to give something back,” he recalls – not just by throwing cash into a charity tin, but by doing public service. So he swerved: in 2006, he attended the Chicago police academy and then worked for a year as a cop in one of the city’s toughest neighbourhoods. Later he pulled the disparate parts of his life together and used his number-crunching skills to build the first predictive data system for the Chicago police (and one of the first in any western police force), to indicate where crime was likely to break out.

This was such a success that Goldstein was asked by Rahm Emanuel, the city’s mayor, to create predictive data systems for the wider Chicago government. The fruits of this effort – which include a website known as “WindyGrid” – went live a couple of years ago, to considerable acclaim inside the techie scene.

This tale might seem unremarkable. We are all used to hearing politicians, business leaders and management consultants declare that the computing revolution is transforming our lives. And as my colleague Tim Harford pointed out in these pages last week, the idea of using big data is now wildly fashionable in the business and academic worlds….

In America when top bankers become rich, they often want to “give back” by having a second career in public service: just think of all those Wall Street financiers who have popped up at the US Treasury in recent years. But hoodie-wearing geeks do not usually do the same. Sure, there are some former techie business leaders who are indirectly helping government. Steve Case, a co-founder of AOL, has supported White House projects to boost entrepreneurship and combat joblessness. Tech entrepreneurs also make huge donations to philanthropy. Facebook’s Mark Zuckerberg, for example, has given funds to Newark education. And the whizz-kids have also occasionally been summoned by the White House in times of crisis. When there was a disastrous launch of the government’s healthcare website late last year, the Obama administration enlisted the help of some of the techies who had been involved with the president’s election campaign.

But what you do not see is many tech entrepreneurs doing what Goldstein did: deciding to spend a few years in public service, as a government employee. There aren’t many Zuckerberg types striding along the corridors of federal or local government.
. . .
It is not difficult to work out why. To most young entrepreneurs, the idea of working in a state bureaucracy sounds like utter hell. But if there was ever a time when it might make sense for more techies to give back by doing stints of public service, that moment is now. The civilian public sector badly needs savvier tech skills (just look at the disaster of that healthcare website for evidence of this). And as the sector’s founders become wealthier and more powerful, they need to show that they remain connected to society as a whole. It would be smart political sense.
So I applaud what Goldstein has done. I also welcome that he is now trying to persuade his peers to do the same, and that places such as the University of Chicago (where he teaches) and New York University are trying to get more young techies to think about working for government in between doing those dazzling IPOs. “It is important to see more tech entrepreneurs in public service. I am always encouraging people I know to do a ‘stint in government’. I tell them that giving back cannot just be about giving money; we need people from the tech world to actually work in government,” Goldstein says.

But what is really needed is for more technology CEOs and leaders to get involved by actively talking about the value of public service – or even encouraging their employees to interrupt their private-sector careers with the occasional spell as a government employee (even if it is not in a sector quite as challenging as the police). Who knows? Maybe it could be Sheryl Sandberg’s next big campaigning mission. After all, if she does ever jump back to Washington, that could have a powerful demonstration effect for techie women and men. And shake DC a little too.”

Why Are Rich Countries Democratic?


Ricardo Hausmann at Project Syndicate: “When Adam Smith was 22, he famously proclaimed that, “Little else is requisite to carry a state to the highest degree of opulence from the lowest barbarism, but peace, easy taxes, and a tolerable administration of justice: all the rest being brought about by the natural course of things.” Today, almost 260 years later, we know that nothing could be further from the truth.
The disappearance of Malaysia Airlines Flight 370 shows how wrong Smith was, for it highlights the intricate interaction between modern production and the state. To make air travel feasible and safe, states ensure that pilots know how to fly and that aircraft pass stringent tests. They build airports and provide radar and satellites that can track planes, air traffic controllers to keep them apart, and security services to keep terrorists on the ground. And, when something goes wrong, it is not peace, easy taxes, and justice that are called in to assist; it is professional, well-resourced government agencies.
All advanced economies today seem to need much more than the young Smith assumed. And their governments are not only large and complex, comprising thousands of agencies that administer millions of pages of rules and regulations; they are also democratic – and not just because they hold elections every so often. Why?
By the time he published The Wealth of Nations, at age 43, Smith had become the first complexity scientist. He understood that the economy was a complex system that needed to coordinate the work of thousands of people just to make things as simple as a meal or a suit.
But Smith also understood that while the economy was too intricate to be organized by anybody, it has the capacity to self-organize. It possesses an “invisible hand,” which operates through market prices to provide an information system that can be used to calculate whether using resources for a given purpose is worthwhile – that is, profitable.
Profit is an incentive system that leads firms and individuals to respond to the information provided by prices. And capital markets are a resource-mobilization system that provides money to those companies and projects that are expected to be profitable – that is, the ones that respond adequately to market prices.
But modern production requires many inputs that markets do not provide. And, as in the case of airlines, these inputs – rules, standards, certifications, infrastructure, schools and training centers, scientific labs, security services, among others – are deeply complementary to the ones that can be procured in markets. They interact in the most intricate ways with the activities that markets organize.
So here’s the question: Who controls the provision of the publicly provided inputs? The prime minister? The legislature? Which country’s top judges have read the millions of pages of legislation or considered how they complement or contradict each other, much less applied them to the myriad different activities that comprise the economy? Even a presidential executive cannot be fully aware of the things that are done or not done by the thousands of government agencies and how they affect each part of society.
This is an information-rich problem, and, like the social-coordination challenge that the market addresses, it does not allow for centralized control. What is needed is something like the invisible hand of the market: a mechanism for self-organization. Elections clearly are not enough, because they typically occur at two- or four-year intervals and collect very little information per voter.
Instead, successful political systems have had to create an alternative invisible hand – a system that decentralizes the power to identify problems, propose solutions, and monitor performance, such that decisions are made with much more information.
To take just one example, the United States’ federal government accounts for just 537 of the country’s roughly 500,000 elected positions. Clearly, there is much more going on elsewhere.
The US Congress has 100 senators with 40 aides each, and 435 representatives with 25 aides each. They are organized into 42 committees and 182 subcommittees, meaning that there are 224 parallel conversations going on. And this group of more than 15,000 people is not alone. Facing them are some 22,000 registered lobbyists, whose mission is (among other goals) to sit down with legislators and draft legislation.
This, together with a free press, is part of the structure that reads the millions of pages of legislation and monitors what government agencies do and do not do. It generates the information and the incentives to respond to it. It affects the allocation of budgetary resources. It is an open system in which anybody can create news or find a lobbyist to make his case, whether it is to save the whales or to eat them.
Without such a mechanism, the political system cannot provide the kind of environment that modern economies need. That is why all rich countries are democracies, and it is why some countries, like my own (Venezuela), are becoming poorer. Although some of these countries do hold elections, they tend to stumble at even the simplest of coordination problems. Lining up to vote is no guarantee that citizens will not also have to line up for toilet paper.”

Open Data: What Is It and Why Should You Care?


Jason Shueh at Government Technology: “Though the debate about open data in government is an evolving one, it is indisputably here to stay — it can be heard in both houses of Congress, in state legislatures, and in city halls around the nation.
Already, 39 states and 46 localities provide data sets to data.gov, the federal government’s online open data repository. And 30 jurisdictions, including the federal government, have taken the additional step of institutionalizing their practices in formal open data policies.
Though the term “open data” is spoken of frequently — and has been since President Obama took office in 2009 — what it is and why it’s important isn’t always clear. That’s understandable, perhaps, given that open data lacks a unified definition.
“People tend to conflate it with big data,” said Emily Shaw, the national policy manager at the Sunlight Foundation, “and I think it’s useful to think about how it’s different from big data in the sense that open data is the idea that public information should be accessible to the public online.”
Shaw said the foundation, a Washington, D.C., non-profit advocacy group promoting open and transparent government, believes the term open data can be applied to a variety of information created or collected by public entities. Among the benefits of open data are improved measurement of policies, better government efficiency, deeper analytical insights, greater citizen participation, and a boost to local companies by way of products and services that use government data (think civic apps and software programs).
“The way I personally think of open data,” Shaw said, “is that it is a manifestation of the idea of open government.”

What Makes Data Open

For governments hoping to adopt open data in policy and in practice, simply making data available to the public isn’t enough to make that data useful. Open data, though straightforward in principle, requires a specific approach based on the agency or organization releasing it, the kind of data being released and, perhaps most importantly, its targeted audience.
According to the foundation’s California Open Data Handbook, published in collaboration with Stewards of Change Institute, a national group supporting innovation in human services, data must first be both “technically open” and “legally open.” The guide defines the terms in this way:
Technically open: [data] available in a machine-readable standard format, which means it can be retrieved and meaningfully processed by a computer application
Legally open: [data] explicitly licensed in a way that permits commercial and non-commercial use and re-use without restrictions.
Technically open means that data is easily accessible to its intended audience. If the intended users are developers and programmers, Shaw said, the data should be presented within an application programming interface (API); if it’s intended for researchers in academia, data might be structured in a bulk download; and if it’s aimed at the average citizen, data should be available without requiring software purchases.
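In concrete terms, "technically open" usually means a developer can pull records straight from an endpoint and a researcher can grab a bulk file, with no manual steps in between. A minimal sketch follows; the URLs are placeholders for a hypothetical portal, not a real agency's addresses.

```python
# Sketch: two routes to the same open data set, assuming hypothetical URLs.
import csv
import io
import json
import urllib.request

# Route 1: a developer hits a JSON API (placeholder URL).
api_url = "https://data.example.gov/api/permits.json?year=2014"
with urllib.request.urlopen(api_url) as resp:
    records = json.load(resp)   # machine-readable: parsed directly into objects
print(f"API returned {len(records)} records")

# Route 2: a researcher grabs the bulk CSV download (placeholder URL).
bulk_url = "https://data.example.gov/downloads/permits_2014.csv"
with urllib.request.urlopen(bulk_url) as resp:
    reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
    rows = list(reader)
print(f"Bulk file contains {len(rows)} rows")
```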
….

4 Steps to Open Data

Creating open data isn’t without its complexities. There are many tasks that need to happen before an open data project ever begins. A full endorsement from leadership is paramount. Adding the project into the work flow is another. And allaying fears and misunderstandings is expected with any government project.
After the basic table stakes are placed, the handbook prescribes four steps: choosing a set of data, attaching an open license, making it available through a proper format and ensuring the data is discoverable.
1. Choose a Data Set
Choosing a data set can appear daunting, but it doesn’t have to be. Shaw said ample resources are available from the foundation and others on how to get started with this — see our list of open data resources for more information. In the case of selecting a data set, or sets, she referred to the foundation’s recently updated guidelines that urge identifying data sets based on goals and the demand from citizen feedback.
2. Attach an Open License
Open licenses dispel ambiguity and encourage use. However, they need to be proactive, and this means users should not be forced to request the information in order to use it — a common symptom of data accessed through the Freedom of Information Act. Tips for reference can be found at Opendefinition.org, a site that has a list of examples and links to open licenses that meet the definition of open use.
3. Format the Data to Your Audience
As previously stated, Shaw recommends tailoring the format of data to the audience, with the ideal being that data is packaged in formats that can be digested by all users: developers, civic hackers, department staff, researchers and citizens. This could mean it’s put into APIs, spreadsheet docs, text and zip files, FTP servers and torrent networking systems (a way to download files from different sources). The file type and the system for download all depends on the audience.
“Part of learning about what formats government should offer data in is to engage with the prospective users,” Shaw said.
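One way to act on that advice is to keep a single master table and export it into whatever formats the intended audiences need. A minimal sketch, with an invented table and illustrative file names:

```python
# Sketch: exporting one data set in several formats for different audiences.
import pandas as pd

inspections = pd.DataFrame({
    "facility": ["Main St Cafe", "Oak Ave Deli"],
    "score": [92, 78],
    "inspected_on": ["2014-03-01", "2014-03-15"],
})

inspections.to_csv("inspections.csv", index=False)          # spreadsheet users
inspections.to_json("inspections.json", orient="records")   # developers / APIs
with open("inspections.txt", "w") as f:                     # plain-text readers
    f.write(inspections.to_string(index=False))
```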
4. Make it Discoverable
If open data is strewn across multiple download links and wedged into various nooks and crannies of a website, it probably won’t be found. Shaw recommends a centralized hub that acts as a one-stop shop for all open data downloads. In many jurisdictions, these Web pages and websites have been called “portals;” they are the online repositories for a jurisdiction’s open data publishing.
“It is important for thinking about how people can become aware of what their governments hold. If the government doesn’t make it easy for people to know what kinds of data is publicly available on the website, it doesn’t matter what format it’s in,” Shaw said. She pointed to public participation — a recurring theme in open data development — to incorporate into the process to improve accessibility.
Examples of portals can be found in numerous cities across the U.S., such as San Francisco, New York, Los Angeles, Chicago and Sacramento, Calif.

Open Government: Building Trust and Civic Engagement


Gavin Newsom and Zachary Bookman in the Huffington Post: “Daily life has become inseparable from new technologies. Our phones and tablets let us shop from the couch, track how many miles we run, and keep in touch with friends across town and around the world – benefits barely possible a decade ago.
With respect to our communities, Uber and Lyft now shuttle us around town, reducing street traffic and parking problems. Adopt-a-Hydrant apps coordinate efforts to dig out hydrants after snowstorms, saving firefighters time when battling blazes. Change.org helps millions petition for and effect social and political change.
Yet as a sector, government typically embraces technology well behind the consumer curve. This leads to disheartening stories, like veterans waiting months or years for disability claims due to outdated technology or the troubled rollout of the Healthcare.gov website. This is changing.
Cities and states are now the driving force in a national movement to harness technology to share a wealth of government information and data. Many forward-thinking local governments now provide effective tools to the public to make sense of all this data.
This is the Open Government movement.
For too long, government information has been locked away in agencies, departments, and archaic IT systems. Senior administrators often have to request the data they need to do their jobs from system operators. Elected officials, in turn, often have to request data from these administrators. The public remains in the dark, and when data is released, it appears in the form of inaccessible or incomprehensible facts and figures.
Governments keep massive volumes of data, from 500 page budget documents to population statistics to neighborhood crime rates. Although raw data is a necessary component of Open Government, for it to empower citizens and officials the data must be transformed into meaningful and actionable insights. Governments must both publish information in “machine readable” format and give people the tools to understand and act on it.
New platforms can transform data from legacy systems into meaningful visualizations. Instant, web-based access to this information not only saves time and money, but also helps government make faster and better decisions. This helps governments serve their communities and builds trust with citizens.
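As a toy illustration of moving from machine-readable figures to something a resident can read at a glance, the sketch below charts spending by department from a small invented table; a real dashboard would draw on the government's published data rather than hard-coded numbers.

```python
# Sketch: a basic chart from machine-readable spending data (figures invented).
import matplotlib.pyplot as plt

departments = ["Police", "Fire", "Parks", "Public Works"]
spending_millions = [48.2, 31.5, 9.7, 22.4]

plt.figure(figsize=(6, 3))
plt.bar(departments, spending_millions)
plt.ylabel("Spending ($ millions)")
plt.title("City spending by department, FY2014")
plt.tight_layout()
plt.savefig("spending_by_department.png")
```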
Leading governments like Palo Alto have begun employing technology to leverage these benefits. Even the City of Bell, California, which made headlines in 2010 when senior administrators siphoned millions of dollars from the general fund, is now leveraging cloud technology to turn a new page in its history. The city has presented its financial information in an easily accessible, interactive platform at Bell.OpenGov.com. Citizens and officials alike can see vivid, user generated charts and graphs that show where money goes, what services are offered to residents, and how much those services cost.
In 2009, San Francisco became an early adopter of the open data movement when an executive order made open and machine-readable the default for our consolidated government. That simple order spurred an entirely new industry and the City of San Francisco has been adopting apps like the San Francisco Heat Vulnerability Index and Neighborhood Score ever since. The former identifies areas vulnerable to heat waves with the hope of better preparedness, while the latter provides an overall health and sustainability score, block-by-block for every neighborhood in the city. These new apps use local, state, federal, and private data sets to allow residents to see how their neighborhoods rank.
The California State Lands Commission, responsible for the stewardship of the state’s lands, waterways, and natural resources, is getting in on the Open Government movement too. The Commission now publishes five years of expense and revenue data at CAStateLands.opengov.com (which just launched today!). California residents can now see how the state generates nearly half a billion dollars in revenue from oil and gas contracts, mineral royalties, and leasing programs. The State can now communicate how it manages those resources, so that citizens understand how their government works for them.
The Open Government movement provides a framework for improved public administration and a path for more trust and engagement. Governments have been challenged to do better, and now they can.”

Potholes and Big Data: Crowdsourcing Our Way to Better Government


Phil Simon in Wired: “Big Data is transforming many industries and functions within organizations with relatively limited budgets.
Consider Thomas M. Menino, up until recently Boston’s longest-serving mayor. At some point in the past few years, Menino realized that it was no longer 1950. Perhaps he was hobnobbing with some techies from MIT at dinner one night. Whatever his motivation, he decided that there just had to be a better, more cost-effective way to maintain and fix the city’s roads. Maybe smartphones could help the city take a more proactive approach to road maintenance.
To that end, in July 2012, the Mayor’s Office of New Urban Mechanics launched a new project called Street Bump, an app that allows drivers to automatically report road hazards to the city as soon as they hear that unfortunate “thud,” with their smartphones doing all the work.
The app’s developers say their work has already sparked interest from other cities in the U.S., Europe, Africa and elsewhere that are imagining other ways to harness the technology.
Before they even start their trip, drivers using Street Bump fire up the app, then set their smartphones either on the dashboard or in a cup holder. The app takes care of the rest, using the phone’s accelerometer — a motion detector — to sense when a bump is hit. GPS records the location, and the phone transmits it to a remote AWS server.
But that’s not the end of the story. It turned out that the first version of the app reported far too many false positives (i.e., phantom potholes). This finding no doubt gave ammunition to the many naysayers who believe that technology will never be able to do what people can and that things are just fine as they are, thank you. Street Bump 1.0 “collected lots of data but couldn’t differentiate between potholes and other bumps.” After all, your smartphone or cell phone isn’t inert; it moves in the car naturally because the car is moving. And what about the scores of people whose phones “move” because they check their messages at a stoplight?
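The core detection step is simple in principle, which is also why the false positives arise: a naive detector flags any spike in vertical acceleration above a threshold and records the GPS fix when one occurs, so a phone picked up at a stoplight can look just like a pothole. The sketch below illustrates that naive approach with invented readings and a placeholder reporting URL; it is not Street Bump's actual algorithm.

```python
# Sketch: naive "bump" detection from accelerometer samples (readings invented).
# A spike in vertical acceleration above a threshold is reported with the GPS fix.
# This is the kind of simple detector that also fires when a phone is picked up,
# which is why the first version of such an app produces false positives.

GRAVITY = 9.81          # m/s^2
THRESHOLD = 4.0         # deviation from resting gravity that we call a "bump"
REPORT_URL = "https://example.org/bumps"   # placeholder, not the real endpoint

samples = [             # (vertical acceleration m/s^2, latitude, longitude)
    (9.8, 42.3601, -71.0589),
    (9.9, 42.3602, -71.0590),
    (15.2, 42.3603, -71.0591),   # hit a pothole (or picked up the phone...)
    (9.7, 42.3604, -71.0592),
]

def detect_bumps(samples):
    bumps = []
    for accel, lat, lon in samples:
        if abs(accel - GRAVITY) > THRESHOLD:
            bumps.append({"lat": lat, "lon": lon, "accel": accel})
    return bumps

for bump in detect_bumps(samples):
    # A real app would POST this to REPORT_URL; here we just print it.
    print(f"bump at ({bump['lat']}, {bump['lon']}), accel={bump['accel']} m/s^2")
```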
To their credit, Menino and his motley crew weren’t entirely discouraged by this initial setback. In their gut, they knew that they were on to something. The idea and potential of the Street Bump app were worth pursuing and refining, even if the first version was a bit lacking. Plus, they have plenty of examples from which to learn. It’s not like the iPad, iPod, and iPhone haven’t evolved considerably over time.
Enter InnoCentive, a Massachusetts-based firm specializing in open innovation and crowdsourcing. The City of Boston contracted InnoCentive to improve Street Bump and reduce the amount of tail chasing. The company accepted the challenge and essentially turned it into a contest, a process sometimes called gamification. InnoCentive offered a network of 400,000 experts a share of $25,000 in prize money donated by Liberty Mutual.
Almost immediately, the ideas to improve Street Bump poured in from unexpected places. This crowd had wisdom. Ultimately, the best suggestions came from:

  • A group of hackers in Somerville, Massachusetts, that promotes community education and research
  • The head of the mathematics department at Grand Valley State University in Allendale, Michigan
  • An anonymous software engineer

…Crowdsourcing roadside maintenance isn’t just cool. Increasingly, projects like Street Bump are resulting in substantial savings — and better government.”

Charities Try New Ways to Test Ideas Quickly and Polish Them Later


Ben Gose in the Chronicle of Philanthropy: “A year ago, a division of TechSoup Global began working on an app to allow donors to buy a hotel room for victims of domestic violence when no other shelter is available. Now that app is a finalist in a competition run by a foundation that combats human trafficking—and a win could mean a grant worth several hundred thousand dollars. The app’s evolution—adding a focus on sex slaves to the initial emphasis on domestic violence—was hardly accidental.
Caravan Studios, the TechSoup division that created the app, has embraced a new management approach popular in Silicon Valley known as “lean start-up.”
The principles, which are increasingly popular among nonprofits, emphasize experimentation over long-term planning and urge groups to get products and services out to clients as early as possible so the organizations can learn from feedback and make changes.
When the app, known as SafeNight, was still early in the design phase, Caravan posted details about the project on its website, including applications for grants that Caravan had not yet received. In lean-start-up lingo, Caravan put out a “minimal viable product” and hoped for feedback that would lead to a better app.
Caravan soon heard from antitrafficking organizations, which were interested in the same kind of service. Caravan eventually teamed up with the Polaris Project and the State of New Jersey, which were working on a similar app, to jointly create an app for the final round of the antitrafficking contest. Humanity United, the foundation sponsoring the contest, plans to award $1.8-million to as many as three winners later this month.
Marnie Webb, CEO of Caravan, which is building an array of apps designed to curb social problems, says lean-start-up principles help Caravan work faster and meet real needs.
“The central idea is that any product that we develop will get better if it lives as much of its life as possible outside of our office,” Ms. Webb says. “If we had kept SafeNight inside and polished it and polished it, it would have been super hard to bring on a partner because we would have invested too much.”….
Nonprofits developing new tech tools are among the biggest users of lean-start-up ideas.
Upwell, an ocean-conservation organization founded in 2011, scans the web for lively ocean-related discussions and then pushes to turn them into full-fledged movements through social-media campaigns.
Lean principles urge groups to steer clear of “vanity metrics,” such as site visits, that may sound impressive but don’t reveal much. Upwell tracks only one number—“social mentions”—the much smaller count of people who actually say something about an issue online.
After identifying a hot topic, Upwell tries to assemble a social-media strategy within 24 hours—what it calls a “minimum viable campaign.”
“We do the least amount of work to get something out the door that will get results and information,” says Rachel Dearborn, Upwell’s campaign director.
Campaigns that don’t catch on are quickly scrapped. But campaigns that do catch on get more time, energy, and money from Upwell.
After Hurricane Sandy, in 2012, a prominent writer on ocean issues and others began pushing the idea that revitalizing the oyster beds near New York City could help protect the shore from future storm surges. Upwell’s “I (Oyster) New York” campaign featured a catchy logo and led to an even bigger spike in attention.

‘Build-Measure-Learn’

Some organizations that could hardly be called start-ups are also using lean principles. GuideStar, the 20-year-old aggregator of financial information about charities, is using the lean approach to more quickly develop tools that meet the needs of its users.
The lean process promotes short “build-measure-learn” cycles, in which a group frequently updates a product or service based on what it hears from its customers.
GuideStar and the Nonprofit Finance Fund have developed a tool called Financial Scan that allows charities to see how they compare with similar groups on various financial measures, such as their mix of earned revenue and grant funds.
When it analyzed who was using the tool, GuideStar found heavy interest from both foundations and accounting firms, says Evan Paul, GuideStar’s senior director of products and marketing.
In the future, he says, GuideStar may create three versions of Financial Scan to meet the distinct interests of charities, foundations, and accountants.
“We want to get more specific about how people are using our data to make decisions so that we can help make those decisions better and faster,” Mr. Paul says….


Lean Start-Up: a Glossary of Terms for a Hot New Management Approach

Build-Measure-Learn

Instead of spending considerable time developing a product or service for a big rollout, organizations should consider using a continuous feedback loop: “build” a program or service, even if it is not fully fleshed out; “measure” how clients are affected; and “learn” by improving the program or going in a new direction. Repeat the cycle.

Minimum Viable Product

An early version of a product or service that may be lacking some features. This approach allows an organization to obtain feedback from clients and quickly determine the usefulness of a product or service and how to improve it.

Get Out of the Building

To determine whether a product or service is needed, talk to clients and share your ideas with them before investing heavily.

A/B Testing

Create two versions of a product or service, show them to different groups, and see which performs best.
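A minimal sketch of how "see which performs best" might be judged in code, using invented sign-up counts for the two versions and a simple two-proportion z-test; a real test would also plan sample sizes in advance.

```python
# Sketch: comparing two versions of a donation page (counts invented).
# Versions A and B were shown to different visitors; we compare conversion
# rates with a simple two-proportion z-test.
from math import sqrt

visitors_a, conversions_a = 1200, 96    # version A
visitors_b, conversions_b = 1180, 130   # version B

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

print(f"A converts {p_a:.1%}, B converts {p_b:.1%}, z = {z:.2f}")
# |z| above roughly 1.96 suggests the difference is unlikely to be noise.
```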

Failing Fast

By quickly realizing that a product or service isn’t viable, organizations save time and money and gain valuable information for their next effort.

Pivot

Making a significant change in strategy when the early testing of a minimum viable product shows that the product or service isn’t working or isn’t needed.

Vanity Metrics

Measures that seem to provide a favorable picture but don’t accurately capture the impact of a product. An example might be a tally of website page views. A more meaningful measure—or an “actionable metric,” in the lean lexicon—might be the number of active users of an online service.
Sources: The Lean Startup, by Eric Ries; The Ultimate Dictionary of Lean for Social Good, a publication by Lean Impact”