Can NewsGenius make annotated government documents more understandable?


at E Pluribus Unum: “Last year, Rap Genius launched News Genius to help decode current events. Today, the General Services Administration (GSA) announced that digital annotation service News Genius is now available to help decode federal government Web projects:

“The federal government can now unlock the collaborative “genius” of citizens and communities to make public services easier to access and understand with a new free social media platform launched by GSA today at the Federal #SocialGov Summit on Entrepreneurship and Small Business,” writes Justin Herman, federal social media manager.

“News Genius, an annotation wiki based on Rap Genius now featuring federal-friendly Terms of Service, allows users to enhance policies, regulations and other documents with in-depth explanations, background information and paths to more resources. In the hands of government managers it will improve public services through citizen feedback and plain language, and will reduce costs by delivering these benefits on a free platform that doesn’t require a contract.”

This could be a significant improvement in making complicated policy documents and regulations understandable to the governed. While plain writing is indispensable for open government and mandated by law and regulation, it is not practiced uniformly in Washington.

If people can understand more about what a given policy, proposed rule or regulation actually says, they may well be more likely to participate in the process of revising it. We’ll see if people adopt the tool, but on balance, that sounds like a step ahead.”

Randomized control trials (RCTs): interesting, but a marginal tool for governments


ODI Researcher Philipp Krause at BeyondBudgets: “Randomized control trials (RCTs) have had a great decade. The stunning line-up of speakers who celebrated J-PAL’s tenth anniversary in Boston last December gives some indication of just how great. They are the shiny new tool of development policy, and a lot of them are pretty cool. Browsing through J-PAL’s library of projects, it’s easy to see how so many of them end up in top-notch academic journals.
So far, so good. But the ambition of RCTs is not just to provide a gold-standard measurement of impact. They aim to actually have an impact on the real world themselves. The scenario goes something like this: researchers investigate the effect of an intervention and use the findings to either get out of that mess quickly (if the intervention doesn’t work) or scale it up quickly (if it does). In the pursuit of this impact-seeker’s Nirvana, it’s easy to conflate a couple of things, notably that an RCT is not the only way to evaluate impact; and evaluating impact is not the only way to use evidence for policy. Unfortunately, it is now surprisingly common to hear RCTs conflated with evidence-use, and evidence-use equated with the key ingredient for better public services in developing countries. The reality of evidence use is different.
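For readers unfamiliar with the mechanics, the statistical core of an RCT is simple: because assignment is random, the average treatment effect can be estimated as a difference in group means. A minimal sketch, using entirely hypothetical outcome data (the numbers below are made up for illustration):

```python
import random
import statistics

def rct_effect(treated, control):
    """Estimate the average treatment effect as the difference in
    group means, with a normal-approximation 95% confidence interval."""
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / len(treated)
          + statistics.variance(control) / len(control)) ** 0.5
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Hypothetical outcomes (e.g. test scores) for 500 people per arm.
random.seed(0)
control = [random.gauss(50, 10) for _ in range(500)]
treated = [random.gauss(52, 10) for _ in range(500)]

effect, ci = rct_effect(treated, control)
print(f"estimated effect: {effect:.2f}, 95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")
```

The point of randomization is that this comparison is unbiased; the point of the article is that an unbiased estimate is not the only, or cheapest, input a government can use.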
Today’s rich countries didn’t get rich by using evidence systematically. This is a point that we recently discussed at a big World Bank – ODI conference on the (coincidental?) tenth anniversary of the WDR 2004. Lant Pritchett made it best when describing Randomistas as engaging in faith-based activity: nobody could accuse the likes of Germany, Switzerland, Sweden or the US of achieving human development by systematically scaling up what works.
What these countries do have in spades is people noisily demanding stuff, and governments giving it to them. In fact, some of the greatest innovations in providing health, unemployment benefits and pensions to poor people (and taking them to scale) happened because citizens seemed to want them, and giving them stuff seemed like a good way to shut them up. Ask Otto Bismarck. It’s not too much of a stretch to call this the history of public spending in a nutshell….
The bottom line is that governments that care about impact have plenty of cheaper, timelier and more appropriate tools and options available to them than RCTs. That doesn’t mean RCTs shouldn’t be done, of course. And the evaluation of aid is a different matter altogether, where donors are free to be as inefficient about evidence-basing as they wish without burdening poor countries.
But for governments the choice of how to go about using systematic evidence is theirs to make. And it’s a tough capability to pick up. Many governments choose not to do it, and there’s no evidence that they suffer for it. It would be wrong for donors to suggest to low-income countries that RCTs are in any way critical for their public service capability. Better call them what they are: interesting, but marginal.”

Pursuing adoption of free and open source software in governments


at O’Reilly Radar: “Reasons for government agencies to adopt free and open source software have been aired repeatedly, including my article mentioned earlier. A few justifications include:

Access
Document formats must allow citizens to read and submit documents without purchasing expensive tools.
Participation
Free software allows outside developers to comment and contribute.
Public ownership
Whatever tools are developed or purchased by the government should belong to the public, as long as no security issues are involved.
Archiving
Proprietary formats can be abandoned by their vendors after only two or three years.
Transparency
Free software allows the public to trust that the tools are accurate and have no security flaws.
Competition
The government has an interest in avoiding lock-in and ensuring that software can be maintained or replaced.
Cost
In the long run, an agency can save a lot of money by investing in programming or system administration skills, or hiring a firm to maintain the free software.

Obviously, though, government agencies haven’t gotten the memo. I’m not just talking metaphorically; there have been plenty of memos urging the use of open source, ranging from the US Department of Defense to laws passed in a number of countries.
And a lot of progress has taken place. Munich, famously, has switched its desktops to GNU/Linux and OpenOffice.org — but the process took 13 years. Elsewhere in Europe, Spain has been making strides, and the UK promises to switch. In Latin America, Brazil has made the most progress. Many countries that could benefit greatly from using free software — and have even made commitments to do so — are held back by a lack of IT staff with the necessary expertise.
Key barriers include:

Procurement processes
General consensus among knowledgeable software programmers holds that age-old rules for procurement shouldn’t be tossed out, but could be tweaked to admit bids from more small businesses that want to avoid the bureaucracy of registering with the government and answering Requests for Proposals (RFPs).
Habits of passivity
Government managers are well aware of how little they understand the software development process — in fact, if you ask them what they would need to adopt more open source software, they rarely come up with useful answers. They prefer to hand all development and maintenance to an outside firm, which takes full advantage of this to isolate agencies from one another and lock in expensive rates.
Lack of knowledgeable IT staff
The government managers have reason to keep hands off free software. One LibrePlanet audience member reported that he installed a good deal of free software at his agency, but that when he left, they could not find knowledgeable IT hires to take over. Bit by bit, the free software was replaced with proprietary products known to the new staff.
Political pressure
The urge to support proprietary companies doesn’t just come from their sales people or lobbyists. Buying software, like other products, is seen by politicians as a way of ensuring that jobs remain in their communities.
Lack of information
Free software is rarely backed by a marketing and sales organization, and even if managers have the initiative to go look for the software, they don’t know how to evaluate its maturity and readiness.

Thoroughgoing change in the area of software requires managers to operate with a higher level of awareness: they need to assert control over their missions and adopt agile workflows. That will inevitably spawn a desire for more control over the software that carries out these missions. A posting by Matthew Burton of the Consumer Financial Protection Bureau shows that radical redirections like this are possible.
In the meantime, here are some ideas that the panelists and audience came up with:

Tweaking procurement
If projects can be kept cheap — as Code for America does using grants and other stratagems — they don’t have to navigate the procurement process. Hackathons and challenges can also produce results — but they have a number of limitations, particularly the difficulty developers have in understanding the requirements of the people they want to serve. Some agencies can also bypass procurement by forming partnerships with community groups who produce the software. Finally, a possibly useful model is to take a cut of income from a project instead of charging the government for it.
Education
Managers have heard of open source software by now — great progress from just a few years ago — and are curious about it. On the production side, we need to help them see the benefits of releasing code, and how to monitor their software vendors to make sure the code is really usable. On the consumption side, we need to teach them maturity models and connect them to strong development projects.
Sharing
Most governments have familiar tasks that can be met by the same software base, but end up paying to reinvent (or just reinstall) the wheel. Code for America started a peer network to encourage managers to talk to one another about solutions. The Brazilian government has started a Public Software Portal. The European Union has an open source database and the US federal government has posted a list of government software released as open source.”

Sinkhole of bureaucracy


First article in a Washington Post series “examining the failures at the heart of troubled federal systems” by David A. Fahrenthold: “The trucks full of paperwork come every day, turning off a country road north of Pittsburgh and descending through a gateway into the earth. Underground, they stop at a metal door decorated with an American flag.

Behind the door, a room opens up as big as a supermarket, full of five-drawer file cabinets and people in business casual. About 230 feet below the surface, there is easy-listening music playing at somebody’s desk.
This is one of the weirdest workplaces in the U.S. government — both for where it is and for what it does.
Here, inside the caverns of an old Pennsylvania limestone mine, there are 600 employees of the Office of Personnel Management. Their task is nothing top-secret. It is to process the retirement papers of the government’s own workers.
But that system has a spectacular flaw. It still must be done entirely by hand, and almost entirely on paper.

The employees here pass thousands of case files from cavern to cavern and then key in retirees’ personal data, one line at a time. They work underground not for secrecy but for space. The old mine’s tunnels have room for more than 28,000 file cabinets of paper records.
This odd place is an example of how hard it is to get a time-wasting bug out of a big bureaucratic system.
Held up by all that paper, work in the mine runs as slowly now as it did in 1977….”
See also the accompanying Washington Post graphic, “Data mining, the old-fashioned way.”

Index: Privacy and Security


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on privacy and security and was originally published in 2014.

Globally

  • Percentage of people who feel the Internet is eroding their personal privacy: 56%
  • Internet users who feel comfortable sharing personal data with an app: 37%
  • Number of users who consider it important to know when an app is gathering information about them: 70%
  • How many people in the online world use privacy tools to disguise their identity or location: 28%, or 415 million people
  • Country with the highest penetration of general anonymity tools among Internet users: Indonesia, where 42% of users surveyed use proxy servers
  • Percentage of China’s online population that disguises their online location to bypass governmental filters: 34%

In the United States

Over the Years

  • In 1996, percentage of the American public who were categorized as having “high privacy concerns”: 25%
    • Those with “Medium privacy concerns”: 59%
    • Those who were unconcerned with privacy: 16%
  • In 1998, number of computer users concerned about threats to personal privacy: 87%
  • In 2001, those who reported “medium to high” privacy concerns: 88%
  • Individuals who are unconcerned about privacy: 18% in 1990, down to 10% in 2004
  • How many online American adults are more concerned about their privacy in 2014 than they were a year ago, indicating rising privacy concerns: 64%
  • Number of respondents in 2012 who believe they have control over their personal information: 35%, downward trend for 7 years
  • How many respondents in 2012 continue to perceive privacy and the protection of their personal information as very important or important to the overall trust equation: 78%, upward trend for seven years
  • How many consumers in 2013 trust that their bank is committed to ensuring the privacy of their personal information is protected: 35%, down from 48% in 2004

Privacy Concerns and Beliefs

  • How many Internet users worry about their privacy online: 92%
    • Those who report that their level of concern has increased from 2013 to 2014: 7 in 10
    • How many are at least sometimes worried when shopping online: 93%, up from 89% in 2012
    • Those who have some concerns when banking online: 90%, up from 86% in 2012
  • Number of Internet users who are worried about the amount of personal information about them online: 50%, up from 33% in 2009
    • Those who report that their photograph is available online: 66%
      • Their birthdate: 50%
      • Home address: 30%
      • Cell number: 24%
      • A video: 21%
      • Political affiliation: 20%
  • Consumers who are concerned about companies tracking their activities: 58%
    • Those who are concerned about the government tracking their activities: 38%
  • How many users surveyed felt that the National Security Agency (NSA) overstepped its bounds in light of recent NSA revelations: 44%
  • Respondents who are comfortable with advertisers using their web browsing history to tailor advertisements as long as it is not tied to any other personally identifiable information: 36%, up from 29% in 2012
  • Percentage of voters who do not want political campaigns to tailor their advertisements based on their interests: 86%
  • Percentage of respondents who do not want news tailored to their interests: 56%
  • Percentage of users who are worried that their information will be stolen by hackers: 75%
    • Those who are worried about companies tracking their browsing history for targeted advertising: 54%
  • How many consumers say they do not trust businesses with their personal information online: 54%
  • Top 3 most trusted companies for privacy identified by consumers from across 25 different industries in 2012: American Express, Hewlett Packard and Amazon
    • Most trusted industries for privacy: Healthcare, Consumer Products and Banking
    • Least trusted industries for privacy: Internet and Social Media, Non-Profits and Toys
  • Respondents who admit to sharing their personal information with companies they did not trust in 2012 for reasons such as convenience when making a purchase: 63%
  • Percentage of users who say they prefer free online services supported by targeted ads: 61%
    • Those who prefer paid online services without targeted ads: 33%
  • How many Internet users believe that it is not possible to be completely anonymous online: 59%
    • Those who believe complete online anonymity is still possible: 37%
    • Those who say people should have the ability to use the Internet anonymously: 59%
  • Percentage of Internet users who believe that current laws are not good enough in protecting people’s privacy online: 68%
    • Those who believe current laws provide reasonable protection: 24%

Security Related Issues

  • How many have had an email or social networking account compromised or taken over without permission: 21%
  • Those who have been stalked or harassed online: 12%
  • Those who think the federal government should do more to act against identity theft: 74%
  • Consumers who agree that they will avoid doing business with companies who they do not believe protect their privacy online: 89%
    • Among 65+ year old consumers: 96%

Privacy-Related Behavior

  • How many mobile phone users have decided not to install an app after discovering the amount of information it collects: 54%
  • Number of Internet users who have taken steps to remove or mask their digital footprint (including clearing cookies, encrypting emails, and using virtual networks to mask their IP addresses): 86%
  • Those who have set their browser to disable cookies: 65%
  • Number of users who have not allowed a service to remember their credit card information: 73%
  • Those who have chosen to block an app from accessing their location information: 53%
  • How many have signed up for a two-step sign-in process: 57%
  • Percentage of Gen-X (33-48 year olds) and Millennials (18-32 year olds) who say they never change their passwords or only change them when forced to: 41%
    • How many report using a unique password for each site and service: 4 in 10
    • Those who use the same password everywhere: 7%


Charities Try New Ways to Test Ideas Quickly and Polish Them Later


Ben Gose in the Chronicle of Philanthropy: “A year ago, a division of TechSoup Global began working on an app to allow donors to buy a hotel room for victims of domestic violence when no other shelter is available. Now that app is a finalist in a competition run by a foundation that combats human trafficking—and a win could mean a grant worth several hundred thousand dollars. The app’s evolution—adding a focus on sex slaves to the initial emphasis on domestic violence—was hardly accidental.
Caravan Studios, the TechSoup division that created the app, has embraced a new management approach popular in Silicon Valley known as “lean start-up.”
The principles, which are increasingly popular among nonprofits, emphasize experimentation over long-term planning and urge groups to get products and services out to clients as early as possible so the organizations can learn from feedback and make changes.
When the app, known as SafeNight, was still early in the design phase, Caravan posted details about the project on its website, including applications for grants that Caravan had not yet received. In lean-start-up lingo, Caravan put out a “minimal viable product” and hoped for feedback that would lead to a better app.
Caravan soon heard from antitrafficking organizations, which were interested in the same kind of service. Caravan eventually teamed up with the Polaris Project and the State of New Jersey, which were working on a similar app, to jointly create an app for the final round of the antitrafficking contest. Humanity United, the foundation sponsoring the contest, plans to award $1.8-million to as many as three winners later this month.
Marnie Webb, CEO of Caravan, which is building an array of apps designed to curb social problems, says lean-start-up principles help Caravan work faster and meet real needs.
“The central idea is that any product that we develop will get better if it lives as much of its life as possible outside of our office,” Ms. Webb says. “If we had kept SafeNight inside and polished it and polished it, it would have been super hard to bring on a partner because we would have invested too much.”….
Nonprofits developing new tech tools are among the biggest users of lean-start-up ideas.
Upwell, an ocean-conservation organization founded in 2011, scans the web for lively ocean-related discussions and then pushes to turn them into full-fledged movements through social-media campaigns.
Lean principles urge groups to steer clear of “vanity metrics,” such as site visits, that may sound impressive but don’t reveal much. Upwell tracks only one number—“social mentions”—the much smaller group of people who actually say something about an issue online.
After identifying a hot topic, Upwell tries to assemble a social-media strategy within 24 hours—what it calls a “minimum viable campaign.”
“We do the least amount of work to get something out the door that will get results and information,” says Rachel Dearborn, Upwell’s campaign director.
Campaigns that don’t catch on are quickly scrapped. But campaigns that do catch on get more time, energy, and money from Upwell.
After Hurricane Sandy, in 2012, a prominent writer on ocean issues and others began pushing the idea that revitalizing the oyster beds near New York City could help protect the shore from future storm surges. Upwell’s “I (Oyster) New York” campaign featured a catchy logo and led to an even bigger spike in attention.

‘Build-Measure-Learn’

Some organizations that could hardly be called start-ups are also using lean principles. GuideStar, the 20-year-old aggregator of financial information about charities, is using the lean approach to develop tools more quickly that meet the needs of its users.
The lean process promotes short “build-measure-learn” cycles, in which a group frequently updates a product or service based on what it hears from its customers.
GuideStar and the Nonprofit Finance Fund have developed a tool called Financial Scan that allows charities to see how they compare with similar groups on various financial measures, such as their mix of earned revenue and grant funds.
When it analyzed who was using the tool, GuideStar found heavy interest from both foundations and accounting firms, says Evan Paul, GuideStar’s senior director of products and marketing.
In the future, he says, GuideStar may create three versions of Financial Scan to meet the distinct interests of charities, foundations, and accountants.
“We want to get more specific about how people are using our data to make decisions so that we can help make those decisions better and faster,” Mr. Paul says….


Lean Start-Up: a Glossary of Terms for a Hot New Management Approach

Build-Measure-Learn

Instead of spending considerable time developing a product or service for a big rollout, organizations should consider using a continuous feedback loop: “build” a program or service, even if it is not fully fleshed out; “measure” how clients are affected; and “learn” by improving the program or going in a new direction. Repeat the cycle.

Minimum Viable Product

An early version of a product or service that may be lacking some features. This approach allows an organization to obtain feedback from clients and quickly determine the usefulness of a product or service and how to improve it.

Get Out of the Building

To determine whether a product or service is needed, talk to clients and share your ideas with them before investing heavily.

A/B Testing

Create two versions of a product or service, show them to different groups, and see which performs best.
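The statistics behind “see which performs best” can be as simple as a two-proportion z-test. A minimal sketch, with hypothetical sign-up numbers invented for illustration:

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does version B convert at a different
    rate than version A?  Returns the z statistic and a two-sided
    p-value computed from the normal CDF."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 120 of 1,000 donors convert on version A, 160 of 1,000 on B.
z, p = ab_test(120, 1000, 160, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value suggests the difference between versions is unlikely to be chance, which is what lets a lean team commit to the winner quickly.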

Failing Fast

By quickly realizing that a product or service isn’t viable, organizations save time and money and gain valuable information for their next effort.

Pivot

Making a significant change in strategy when the early testing of a minimum viable product shows that the product or service isn’t working or isn’t needed.

Vanity Metrics

Measures that seem to provide a favorable picture but don’t accurately capture the impact of a product. An example might be a tally of website page views. A more meaningful measure—or an “actionable metric,” in the lean lexicon—might be the number of active users of an online service.
Sources: The Lean Startup, by Eric Ries; The Ultimate Dictionary of Lean for Social Good, a publication by Lean Impact”

The Unwisdom of Crowds




Anne Applebaum on why people-powered revolutions are overrated in the New Republic: “..Yet a successful street revolution, like any revolution, is never guaranteed to leave anything positive in its aftermath—or anything at all. In the West, we often now associate protests with progress, or at least we assume that big crowds—the March on Washington, Paris in 1968—are the benign face of social change. But street revolutions are not always progressive, positive, or even important. Some replace a corrupt tyranny with violence and a political vacuum, which is what happened in Libya. Ukraine’s own Orange Revolution of 2004–2005 produced a new group of leaders who turned out to be just as incompetent as their predecessors. Crowds can be bullying, they can become violent, and they can give rise to extremists: Think Tehran 1979, or indeed Petrograd 1917.
The crowd may not even represent the majority. Because a street revolution makes good copy, and because it provides great photographs, we often mistakenly confuse “people power” with democracy itself. In fact, the creation of democratic institutions — courts, legal systems, bills of rights — is a long and tedious process that often doesn’t interest foreign journalists at all. Tunisia’s ratification of a new constitution earlier this year represented the most significant achievement of the Arab Spring to date, but the agonizing negotiations that led up to that moment were hard for outsiders to understand — and not remotely telegenic.
Equally, it is a dangerous mistake to imagine that “people power” can ever be a substitute for actual elections. On television, a demonstration can loom larger than it should. In both Thailand and Turkey, an educated middle class has recently taken to the streets to protest against democratically elected leaders who have grown increasingly corrupt and autocratic, but who might well be voted back into office tomorrow. In Venezuela, elections are not fair and the media is not free, but the president is supported by many Venezuelans who still have faith in his far-left rhetoric, however much his policies may be damaging the country. Demonstrations might help bring change in some of these countries, but if the change is to be legitimate—and permanent—the electorate will eventually have to endorse it.
As we often forget, some of the most successful transitions to democracy did not involve crowds at all. Chile became a democracy because its dictator, Augusto Pinochet, decided it would become one. In early 1989, well before mass demonstrations in Prague or Berlin, the leaders of the Polish opposition sat down at a large round table with their former jailers and negotiated their way out of communism. There are no spectacular photographs of these transitions, and many people found them unsatisfying, even unjust. But Chile and Poland remain democracies today, not least because their new leaders came to power without any overt opposition from the old regime.
It would be nice if these kinds of transitions were more common, but not every dictator is willing to smooth the path toward change. For that reason, the post-revolutionary moment is often more important than the revolution itself, for this is when the emotion of the mob has to be channeled rapidly—immediately—into legitimate institutions. Not everybody finds this easy. In the wake of the Egyptian revolution, demonstrators found it difficult to abandon Tahrir Square. “We won’t leave because we have to make sure this country is set on the right path,” one protester said at the time. In fact, he should already have been at home, back in his neighborhood, perhaps creating the grassroots political party that might have given Egyptians a real alternative to the Muslim Brotherhood…”

How Twitter Could Help Police Departments Predict Crime


Eric Jaffe in Atlantic Cities: “Initially, Matthew Gerber didn’t believe Twitter could help predict where crimes might occur. For one thing, Twitter’s 140-character limit leads to slang and abbreviations and neologisms that are hard to analyze from a linguistic perspective. Beyond that, while criminals occasionally taunt law enforcement via Twitter, few are dumb or bold enough to tweet their plans ahead of time. “My hypothesis was there was nothing there,” says Gerber.
But then, that’s why you run the data. Gerber, a systems engineer at the University of Virginia’s Predictive Technology Lab, did indeed find something there. He reports in a new research paper that public Twitter data improved the predictions for 19 of 25 crimes that occurred early last year in metropolitan Chicago, compared with predictions based on historical crime patterns alone. Predictions for stalking, criminal damage, and gambling saw the biggest bump…..
Of course, the method says nothing about why Twitter data improved the predictions. Gerber speculates that people are tweeting about plans that correlate highly with illegal activity, as opposed to tweeting about crimes themselves.
Let’s use criminal damage as an example. The algorithm identified 700 Twitter topics related to criminal damage; of these, one topic involved the words “united center blackhawks bulls” and so on. Gather enough sports fans with similar tweets and some are bound to get drunk enough to damage public property after the game. Again this scenario extrapolates far more than the data tells, but it offers a possible window into the algorithm’s predictive power.
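Gerber’s actual model pairs kernel-density estimates of historical hot spots with topic-model features drawn from tweets; the sketch below is a deliberately simplified analogue (synthetic data, a hand-rolled logistic regression, and a single made-up “topic weight” feature) just to illustrate how Twitter-derived features can sit alongside historical density in a prediction:

```python
import math
import random

def train_logistic(X, y, lr=0.1, epochs=200):
    """Plain stochastic-gradient logistic regression; each row of X is
    [historical_density, topic_weight] for one map grid cell."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1 / (1 + math.exp(-z))
            err = p - yi
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, xi):
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 / (1 + math.exp(-z))

# Synthetic grid cells: crime is more likely where both historical
# density and topic chatter (e.g. game-day tweets) are high.
random.seed(1)
X, y = [], []
for _ in range(400):
    hist, topic = random.random(), random.random()
    y.append(1 if hist + topic + random.gauss(0, 0.3) > 1.2 else 0)
    X.append([hist, topic])

w, b = train_logistic(X, y)
print(predict(w, b, [0.9, 0.9]) > predict(w, b, [0.1, 0.1]))
```

The fitted model assigns higher risk to cells where both features are elevated, which is the intuition behind the paper’s finding that adding Twitter topics on top of hot-spot history improves the forecast.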

The map on the left shows predicted crime threat based on historical patterns; the one on the right includes Twitter data. (Via Decision Support Systems)
From a logistical standpoint, it wouldn’t be too difficult for police departments to use this method in their own predictions; both the Twitter data and modeling software Gerber used are freely available. The big question, he says, is whether a department used the same historical crime “hot spot” data as a baseline for comparison. If not, a new round of tests would have to be done to show that the addition of Twitter data still offered a predictive upgrade.
There’s also the matter of public acceptance. Data-driven crime prediction tends to raise any number of civil rights concerns. In 2012, privacy advocates criticized the FBI for a similar plan to use Twitter for crime predictions. In recent months the Chicago Police Department’s own methods have been knocked as a high-tech means of racial profiling. Gerber says his algorithms don’t target any individuals and only cull data posted voluntarily to a public account.”

After the Protests


Zeynep Tufekci in the New York Times on why social media is fueling a boom-and-bust cycle of political protest: “LAST Wednesday, more than 100,000 people showed up in Istanbul for a funeral that turned into a mass demonstration. No formal organization made the call. The news had come from Twitter: Berkin Elvan, 15, had died. He had been hit in the head by a tear-gas canister on his way to buy bread during the Gezi protests last June. During the 269 days he spent in a coma, Berkin’s face had become a symbol of civic resistance shared on social media from Facebook to Instagram, and the response, when his family tweeted “we lost our son” and then a funeral date, was spontaneous.

Protests like this one, fueled by social media and erupting into spectacular mass events, look like powerful statements of opposition against a regime. And whether these take place in Turkey, Egypt or Ukraine, pundits often speculate that the days of a ruling party or government, or at least its unpopular policies, must be numbered. Yet often these huge mobilizations of citizens inexplicably wither away without the impact on policy you might expect from their scale.

This muted effect is not because social media isn’t good at what it does, but, in a way, because it’s very good at what it does. Digital tools make it much easier to build up movements quickly, and they greatly lower coordination costs. This seems like a good thing at first, but it often results in an unanticipated weakness: Before the Internet, the tedious work of organizing that was required to circumvent censorship or to organize a protest also helped build infrastructure for decision making and strategies for sustaining momentum. Now movements can rush past that step, often to their own detriment….

But after all that, in the approaching local elections, the ruling party is expected to retain its dominance.

Compare this with what it took to produce and distribute pamphlets announcing the Montgomery bus boycott in 1955. Jo Ann Robinson, a professor at Alabama State College, and a few students sneaked into the duplicating room and worked all night to secretly mimeograph 52,000 leaflets to be distributed by hand with the help of 68 African-American political, religious, educational and labor organizations throughout the city. Even mundane tasks like coordinating car pools (in an era before there were spreadsheets) required endless hours of collaborative work.

By the time the United States government was faced with the March on Washington in 1963, the protest amounted to not just 300,000 demonstrators but the committed partnerships and logistics required to get them all there — and to sustain a movement for years against brutally enforced Jim Crow laws. That movement had the capacity to leverage boycotts, strikes and demonstrations to push its cause forward. Recent marches on Washington of similar sizes, including the 50th anniversary march last year, also signaled discontent and a desire for change, but just didn’t pose the same threat to the powers that be.

Social media can provide a huge advantage in assembling the strength in numbers that movements depend on. Those “likes” on Facebook, derided as slacktivism or clicktivism, can have long-term consequences by defining which sentiments are “normal” or “obvious” — perhaps among the most important levers of change. That’s one reason the same-sex marriage movement, which uses online and offline visibility as a key strategy, has been so successful, and it’s also why authoritarian governments try to ban social media.

During the Gezi protests, Prime Minister Recep Tayyip Erdogan called Twitter and other social media a “menace to society.” More recently, Turkey’s Parliament passed a law greatly increasing the government’s ability to censor online content and expand surveillance, and Mr. Erdogan said he would consider blocking access to Facebook and YouTube. It’s also telling that one of the first moves by President Vladimir V. Putin of Russia before annexing Crimea was to shut down the websites of dissidents in Russia.
Media in the hands of citizens can rattle regimes. It makes it much harder for rulers to maintain legitimacy by controlling the public sphere. But activists, who have made such effective use of technology to rally supporters, still need to figure out how to convert that energy into greater impact. The point isn’t just to challenge power; it’s to change it.”

How Open Data Policies Unlock Innovation


Tim Cashman at Socrata: “Several trends made the Web 2.0 world we now live in possible. Arguably, the most important of these has been the evolution of online services as extensible technology platforms that enable users, application developers, and other collaborators to create value that extends far beyond the original offering itself.

The Era of ‘Government-as-a-Platform’

The same principles that have shaped the consumer web are now permeating government. Forward-thinking public sector organizations are catching on to the idea that, to stay relevant and vital, governments must go beyond offering a few basic services online. Some have even come to the realization that they are custodians of an enormously valuable resource: the data they collect through their day-to-day operations.  By opening up this data for public consumption online, innovative governments are facilitating the same kind of digital networks that consumer web services have fostered for years.  The era of government as a platform is here, and open data is the catalyst.

The Role of Open Data Policy in Unlocking Innovation in Government

The open data movement continues to transition from an emphasis on transparency to measuring the civic and economic impact of open data programs. As part of this transition, governments are realizing the importance of creating a formal policy to define strategic goals, describe the desired benefits, and provide the scope for data publishing efforts over time.  When well executed, open data policies yield a clear set of benefits. These range from spurring slow-moving bureaucracies into action to procuring the necessary funding to sustain open data initiatives beyond a single elected official’s term.

Four Types of Open Data Policies

There are four main types of policy levers currently in use regarding open data: executive orders, non-binding resolutions, internal regulations, and codified laws. Each of these tools has specific advantages and potential limitations.

Executive Orders

The prime example of an open data executive order in action is President Barack Obama’s Open Data Initiative. While this executive order was short – only four paragraphs on two pages – the real policy magic was a mandate-by-reference that required all U.S. federal agencies to comply with a detailed set of time-bound actions. All of these requirements are publicly viewable on a GitHub repository – a free hosting service for open source software development projects – which is revolutionary in and of itself. Detailed discussions on government transparency took place not in closed-door boardrooms, but online for everyone to see, edit, and improve.

Non-Binding Resolutions

A classic example of a non-binding resolution can be found by doing an online search for the resolution of Palo Alto, California. Short and sweet, this town square-like exercise delivers additional attention to the movement inside and outside of government. The lightweight policy tool also has the benefit of lasting a bit longer than any particular government official. Although, given the countless resolutions that small towns have passed over the years, resolutions are only as durable as people’s memory of them.

Internal Regulations

The New York State Handbook on Open Data is a great example of internal regulations put to good use. Originating from the Office of Information Technology Services, the handbook is a comprehensive, clear, and authoritative guide on how open data is actually supposed to work. Also available on GitHub, the handbook resembles the federal open data project in many ways.

Codified Laws

The archetypal example of open data law comes from San Francisco.
Interestingly, what started as an “Executive Directive” from Mayor Gavin Newsom later turned into legislation and brought with it the power of stronger department mandates and a significant budget. Once enacted, laws are generally hard to revise. In the case of San Francisco, however, the city council has already revised the law twice in four years.
At the federal government level, the Digital Accountability and Transparency Act, or DATA Act, was introduced in both the U.S. House of Representatives (H.R. 2061) and the U.S. Senate (S. 994) in 2013. The act mandates the standardization and publication of a wide variety of the federal government’s financial reports as open data. Although the House voted to pass the DATA Act, it still awaits a vote in the Senate.

The Path to Government-as-a-Platform

Open data policies are an effective way to motivate action and provide clear guidance for open data programs. But they are not a precondition for public-sector organizations to embrace the government-as-a-platform model. In fact, the first step does not involve technology at all. Instead, it involves government leaders realizing that public data belongs to the people. And, it requires the vision to appreciate this data as a shared resource that only increases in value the more widely it is distributed and re-used for analytics, web and mobile apps, and more.
The consumer web has shown the value of open data networks in spades (think Facebook). Now, it’s government’s turn to create the next web.”