New Report Finds Cost-Benefit Analyses Improve Budget Choices & Taxpayer Results


Press Release: “A new report shows cost-benefit analyses have helped states make better investments of public dollars by identifying programs and policies that deliver high returns. However, the majority of states are not yet consistently using this approach when making critical decisions. This 50-state look at cost-benefit analysis, a method that compares the expense of public programs to the returns they deliver, was released today by the Pew-MacArthur Results First Initiative, a project of The Pew Charitable Trusts and the John D. and Catherine T. MacArthur Foundation.

The study, “States’ Use of Cost-Benefit Analysis: Improving Results for Taxpayers,” comes at a time when states are under continuing pressure to direct limited dollars toward the most cost-effective programs and policies while curbing spending on those that do not deliver. The report is the first comprehensive study of how all 50 states and the District of Columbia analyze the costs and benefits of programs and policies, report findings, and incorporate the assessments into decision-making. It identifies key challenges states face in conducting and using the analyses and offers strategies to overcome those obstacles. The study includes a review of state statutes, a search for cost-benefit analyses released between 2008 and 2011, and interviews with legislators, legislative and program evaluation staff, executive officials, report authors, and agency officials.”
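
For readers new to the method, the arithmetic at the heart of a cost-benefit analysis is a discounted comparison of a program's costs and returns over time. A minimal sketch in Python, using invented program figures and an assumed 3% discount rate:

```python
def npv(cash_flows, rate):
    """Net present value of annual cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Invented program: $1.0M up-front cost, then $350k/year in benefits for 5 years.
costs = [1_000_000, 0, 0, 0, 0, 0]
benefits = [0, 350_000, 350_000, 350_000, 350_000, 350_000]
rate = 0.03  # assumed 3% discount rate

bcr = npv(benefits, rate) / npv(costs, rate)
print(f"Benefit-cost ratio: {bcr:.2f}")  # above 1.0, returns exceed costs
```

Real state analyses, like those the report surveys, differ mainly in how carefully they estimate those benefit streams, not in this core calculation.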

If Your Government Fails, Can You Create a New One With Your Phone?


Philip Howard in the Atlantic: “Wherever governments are in crisis, in transition, or in absentia, people are using digital media to try to improve their condition, to build new organizations, and to craft new institutional arrangements. Technology is, in a way, enabling new kinds of states.
It is out of vogue in Washington to refer to failed states. But regardless of the term, there are an unfortunate number of places where governments have ceased to function, creating openings for these new institutional arrangements to flourish. Indeed, state failure doesn’t always take the form of a catastrophic and complete collapse in government. States can fail at particular moments, such as during a natural disaster or an election. States can also fail in particular domains, such as in tax collection.
Information technologies like cellphones and the Internet are generating small acts of self-governance in a wide range of domains and in surprising places.”

Orwell is drowning in data: the volume problem


Dom Shaw in OpenDemocracy: “During World War II, whilst Bletchley Park laboured in the front line of code breaking, the British Government was employing vast numbers of female operatives to monitor and report on telephone, mail and telegraph communications in and out of the country.
The biggest problem, of course, was volume. Without even the most primitive algorithm to detect key phrases that later were to cause such paranoia amongst the sixties and seventies counterculture, causing a whole generation of drug users to use a wholly unnecessary set of telephone synonyms for their desired substance, the army of women stationed in exchanges around the country was driven to report everything and then pass it on up to those whose job it was to analyse such content for significance.
Orwell’s vision of Big Brother’s omniscience was based upon the same model – vast armies of Winston Smiths monitoring data to ensure discipline and control. He saw a culture of betrayal where every citizen was held accountable for their fellow citizens’ political and moral conformity.
Up until the US Government’s Big Data Research and Development Initiative and the NSA development of the Prism programme, the fault lines always lay in the technology used to collate or collect and the inefficiency or competing interests of the corporate systems and processes that interpreted the information. Not for the first time, the bureaucracy was the citizen’s best bulwark against intrusion.
Now that the algorithms have become more complex and the technology tilted towards passive surveillance through automation, the volume problem becomes less of an obstacle….
The technology for obtaining this information, and indeed the administration of it, is handled by corporations. The Government, driven by the creed that suggests private companies are better administrators than civil servants, has auctioned off the job to a dozen or more favoured corporate giants who are, as always, beholden not only to their shareholders, but to their patrons within the government itself….
The only problem the state had was managing the scale of the information gleaned from so many people in so many forms. Not any more. The volume problem has been overcome.”
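
The “primitive algorithm to detect key phrases” that Shaw’s wartime operators lacked, and that lets today’s automated surveillance absorb the volume, is at its simplest watchlist matching over a message stream. A minimal sketch (the watchlist and intercepts are invented for illustration):

```python
# Invented watchlist; a real system would add normalization, stemming,
# and synonym lists (the very synonyms the counterculture resorted to).
WATCHLIST = {"package", "delivery", "meeting point"}

def flag(message):
    """Return the watchlist phrases found in a message, if any."""
    text = message.lower()
    return {phrase for phrase in WATCHLIST if phrase in text}

intercepts = [
    "The delivery arrives Tuesday at the usual meeting point.",
    "Happy birthday, Aunt Maud!",
]
for msg in intercepts:
    hits = flag(msg)
    if hits:  # only flagged traffic goes up the chain for human analysis
        print(f"FLAGGED {sorted(hits)}: {msg}")
```

Even this crude filter inverts the wartime economics: instead of armies of operators reporting everything, human attention is spent only on the flagged fraction.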

The Danger of Human Rights Proliferation


Jacob Mchangama and Guglielmo Verdirame in Foreign Affairs on “When Defending Liberty, Less Is More”: “If human rights were a currency, its value would be in free fall, thanks to a gross inflation in the number of human rights treaties and nonbinding international instruments adopted by international organizations over the last several decades. These days, this currency is sometimes more likely to buy cover for dictatorships than protection for citizens. Human rights once enshrined the most basic principles of human freedom and dignity; today, they can include anything from the right to international solidarity to the right to peace.
Consider just how enormous the body of binding human rights law has become. The Freedom Rights Project, a research group that we co-founded, counts a full 64 human-rights-related agreements under the auspices of the United Nations and the Council of Europe. A member state of both of these organizations that has ratified all these agreements would have to comply with 1,377 human rights provisions (although some of these may be technical rather than substantive). Add to this the hundreds of non-treaty instruments, such as the resolutions of the UN General Assembly and Human Rights Council (HRC). The aggregate body of human rights law now has all the accessibility of a tax code.
Supporters of human rights should worry about this explosion of regulation. If people are to demand human rights, then they must first be able to understand them — a tall order given the current bureaucratic tangle of administrative regulation…”

Code for America: Announcing the 2013 Accelerator Class


Press Release: “Code for America opened applications for the 2013 Accelerator knowing that the competition would be fierce. This year we received over 190 applications from amazing candidates. Today, we’re pleased to announce the five teams chosen to participate in the 2013 Accelerator.

The teams are articulate, knowledgeable, and passionate about their businesses. They come from all over the country — Texas, North Carolina, Florida, and California — and we’re excited to get started with them. Teams include:

ArchiveSocial enables organizations to embrace social media by minimizing risk and eliminating compliance barriers. Specifically, it solves the challenge of retaining Gov 2.0 communications for compliance with FOIA and other public records laws. It currently automates business-grade record keeping of communications on networks such as Facebook, Twitter, and YouTube. Moving forward, ArchiveSocial will help further enforce social media policy and protect the organizational brand.

The Family Assessment Form (FAF) Web is a tool designed by social workers, researchers, and technology experts to help family support practitioners improve family functioning, service planning for families, and organizational performance. The FAF is ideal for use in organizations performing home visitation services for families that address comprehensive concerns about family well-being and child welfare. FAF Web enables all stakeholders to access essential data remotely from any internet-enabled device.

OpenCounter helps entrepreneurs register their businesses with local government. It does so through an online check-out experience that adapts to the applicant’s answers and asks for pertinent information only once. OpenCounter estimates licensing time and costs so entrepreneurs can understand what it will take to get their business off the ground. It’s the TurboTax of business permitting.

SmartProcure is an online information service that provides access to local, state, and federal government procurement data, with two public-interest goals: 1. Enable government agencies to make more efficient procurement decisions and save taxpayer dollars. 2. Empower businesses to sell more effectively and competitively to government agencies. The proprietary system provides access to data from more than 50 million purchase orders issued by 1,700 government agencies.

StreetCred Software helps police agencies manage their arrest warrants, eliminate warrant backlogs, and radically improve efficiency while increasing officer safety. It helps agencies understand their fugitive population, measure effectiveness, and make improvements. StreetCred Software, Inc., was founded by two Texas police officers. One is an 18-year veteran investigator and fugitive hunter, the other a technology industry veteran who became a cop in 2010.”
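
Among the five teams, OpenCounter makes the most concrete technical claim: a form that adapts to the applicant's answers and never asks for a field twice. A minimal, hypothetical sketch of that pattern (the permitting questions and follow-up rules are invented, not OpenCounter's actual logic):

```python
# Hypothetical permitting questionnaire: each rule adds follow-up questions
# keyed on earlier answers; a field already answered is never asked again.
FOLLOW_UPS = {
    ("serves_food", "yes"): ["seating_capacity", "has_grease_trap"],
    ("has_signage", "yes"): ["sign_square_footage"],
}

def run_interview(initial_questions, ask):
    """Walk the question graph, asking for each field at most once."""
    answers = {}
    queue = list(initial_questions)
    while queue:
        field = queue.pop(0)
        if field in answers:  # the "ask only once" guarantee
            continue
        answers[field] = ask(field)
        queue.extend(FOLLOW_UPS.get((field, answers[field]), []))
    return answers

# Canned responses stand in for a live web form:
canned = {"serves_food": "yes", "has_signage": "no",
          "seating_capacity": "40", "has_grease_trap": "yes"}
print(run_interview(["serves_food", "has_signage"], canned.__getitem__))
```

The same branching-checklist structure generalizes to any permitting domain; the hard part in practice is encoding a city's actual rules, not the traversal.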

5 Big Data Projects That Could Impact Your Life


Mashable: “We reached out to a few organizations using information, both hand- and algorithm-collected, to create helpful tools for their communities. This is only a small sample of what’s out there — plenty more pop up each day, and as more information becomes public, the trend will only grow….
1. Transit Time NYC
Transit Time NYC, an interactive map developed by WNYC, lets New Yorkers click a spot in any of the city’s five boroughs for an estimate of subway or train travel times. To create it, WNYC lead developer Steve Melendez broke the city into 2,930 hexagons, then pulled data from open source itinerary platform OpenTripPlanner — the Wikipedia of mapping software — and coupled it with the MTA’s publicly downloadable subway schedule….
2. Twitter’s ‘Topography of Tweets’
In a blog post, Twitter unveiled a new data visualization map that displays billions of geotagged tweets in a 3D landscape format. The purpose is to display, topographically, which parts of certain cities most people are tweeting from…
3. Homicide Watch D.C.
Homicide Watch D.C. is a community-driven data site that aims to cover every murder in the District of Columbia. It’s sorted by “suspect” and “victim” profiles, where it breaks down each person’s name, age, gender and race, as well as original articles reported by Homicide Watch staff…
4. Falling Fruit
Can you find a hidden apple tree along your daily bike commute? Falling Fruit can.
The website highlights overlooked or hidden edibles in urban areas across the world. By collecting public information from the U.S. Department of Agriculture, municipal tree inventories, foraging maps and street tree databases, the site has created a network of 615 types of edibles in more than 570,000 locations. The purpose is to remind urban dwellers that agriculture does exist within city boundaries — it’s just more difficult to find….
5. AIDSVu
AIDSVu is an interactive map that illustrates the prevalence of HIV in the United States. The data is pulled from the U.S. Centers for Disease Control and Prevention’s national HIV surveillance reports, which are collected at both state and county levels each year…”
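
Of these projects, Transit Time NYC (item 1) is the easiest to approximate at home: OpenTripPlanner exposes a standard trip-planning API, so a travel-time estimate between two hexagon centroids reduces to one HTTP request. A rough sketch, assuming a local OTP 1.x server already loaded with the MTA's GTFS feed (the request follows OTP's documented /plan endpoint; the coordinates are illustrative):

```python
import requests  # third-party: pip install requests

# Assumes a local OpenTripPlanner 1.x server with the MTA GTFS feed loaded;
# parameter names follow OTP's documented /plan endpoint.
OTP_PLAN = "http://localhost:8080/otp/routers/default/plan"

def transit_minutes(origin, destination):
    """Best transit travel time between two (lat, lon) points, in minutes."""
    resp = requests.get(OTP_PLAN, params={
        "fromPlace": f"{origin[0]},{origin[1]}",
        "toPlace": f"{destination[0]},{destination[1]}",
        "mode": "TRANSIT,WALK",
        "date": "07-01-2013",
        "time": "9:00am",
    })
    plan = resp.json()["plan"]  # raises KeyError if OTP returned an error
    return min(itin["duration"] for itin in plan["itineraries"]) / 60

# Illustrative hexagon centroids: Times Square to Grand Army Plaza, Brooklyn.
print(transit_minutes((40.758, -73.985), (40.674, -73.970)))
```

WNYC's map is essentially this query repeated 2,930 times, once per hexagon, from a fixed origin.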
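Falling Fruit (item 4) is at heart a normalization exercise: records from heterogeneous sources are mapped onto a common species-and-coordinates schema, then de-duplicated. A minimal sketch with invented records and field names:

```python
# Invented sample records from two hypothetical sources with different schemas.
city_inventory = [{"species": "Malus domestica", "lat": 40.7301, "lon": -73.9925}]
foraging_map = [{"type": "apple", "scientific_name": "Malus domestica",
                 "latitude": 40.73012, "longitude": -73.99248}]

def normalize(record):
    """Map either source schema onto a common (species, lat, lon) key."""
    species = record.get("species") or record.get("scientific_name")
    lat = record.get("lat", record.get("latitude"))
    lon = record.get("lon", record.get("longitude"))
    # Rounding to ~10 m collapses near-duplicate sightings of the same tree.
    return species, round(lat, 4), round(lon, 4)

locations = {normalize(r) for r in city_inventory + foraging_map}
print(len(locations), "unique edible location(s)")  # -> 1
```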

Facebook Is Being Redefined by Its Developing World Users


Tom Simonite in MIT Technology Review: “…as Facebook’s user base continues to expand, a growing proportion of its users think of it quite differently: as a luxury brand, a badge of status, or even a place to make a little extra money. That’s due to the rapid growth in the number of Facebook users signing on from developing countries, a trend underscored by news from the company today that more than 100 million people use a mobile app the company makes for feature phones…
Little research has been done on Facebook’s growth in developing countries (and a lot would be needed to capture even some of the diversity included under the blanket term “developing world”). Two small, recent studies of Kenyan Facebook users in poor areas by Susan Wyche of Michigan State University are among the first to be published, and they provide some interesting insights.
One of Wyche’s ethnographic studies took place in rural Internet cafes, where the researchers were told that “Facebook is a luxury,” only to be indulged if someone had money to spare (here’s a PDF of Wyche’s paper). When study participants thought about social networking, the challenges of low bandwidth and sometimes unreliable electricity supplies were foremost in their minds.
The barriers of cost and infrastructure associated with Facebook led people in another community Wyche and colleagues visited, a slum of Nairobi, to see the service as for more than just socializing. They used it—with mixed success—as a way to make a little money, look for jobs, market themselves, and seek remittances from friends and family overseas. (This reminded me of a recent report on people in Kuwait using Instagram to sell things and run retail businesses.)…
Should it want to, Facebook could even become a powerful tool for efforts to improve the lives of people in poor areas, where the site is gaining traction. The company has already dabbled with using social engineering to boost organ donations in the U.S. (see “Thank God for Facebook: When Platforms Proselytize”). There’s no shortage of similar experiments that could be run in places with more fundamental health problems, where Facebook’s status as a luxury could make it very influential.”

Participatory Democracy in the New Millennium


New literature review in Contemporary Sociology by Francesca Polletta: “By the 1980s, experiments in participatory democracy seemed to have been relegated by scholars to the category of quixotic exercises in idealism, undertaken by committed (and often aging) activists who were unconcerned with political effectiveness or economic efficiency. Today, bottom-up decision making seems all the rage. Crowdsourcing and Open Source, flat management in business, horizontalism in protest politics, collaborative governance in policymaking—these are the buzzwords now and they are all about the virtues of nonhierarchical and participatory decision making.

What accounts for this new enthusiasm for radical democracy? Is it warranted? Are champions of this form understanding key terms like equality and consensus differently than did radical democrats in the 1960s and 70s? And is there any reason to believe that today’s radical democrats are better equipped than their forebears to avoid the old dangers of endless meetings and rule by friendship cliques? In this admittedly selective review, I will take up recent books on participatory democracy in social movements, non- and for-profit organizations, local governments, and electoral campaigning. These are perhaps not the most influential books on participatory democracy since 2000—after all, most of them are brand new—but they speak interestingly to the state of participatory democracy today. Taken together, they suggest that, on one hand, innovations in technology and in activism have made democratic decision making both easier and fairer. On the other hand, the popularity of radical democracy may be diluting its force. If radical democracy comes to mean simply public participation, then spectacles of participation may be made to stand in for mechanisms of democratic accountability.”

The 20 Basics of Open Government


About The 20 Basics of Open Government: “The 20 Basics of Open Government was created with digital love and sweat by the Open Forum Foundation. We did this primarily because it didn’t exist, but really needed to. As we started looking around, we also realized that the terminology of open government is used by a lot of different people to mean a lot of different things. For example, there are multiple groupings of transparency advocates each with their own perspective, there’s the participation community, and then more generally there are techies and govies, each of which use different languages normally anyway.

Watching what is going on around the world in national, state, and local governments, we think opengov is maturing and that the time has come for a basics resource for newbies. Our goal was to include the full expanse of open government and show how it all ties together so that when you, the astute reader, meet up with one of the various opengov cliques that uses the terminology in a narrowly defined way, you can see how they fit into the bigger picture. You should also be able to determine how opengov can best be applied to benefit whatever you’re up to, while keeping in mind the need to provide both access for citizens to engage with government and access to information.
Have a read through it, and let us know what you think! When you find a typo – or something you disagree with – or something we missed, let us know that as well. The easiest way to do it is right there in the comments (we’re not afraid to be called out in public!), but we’re open to email and twitter as well. We’re looking forward to hearing what you think!”

Frontiers in Massive Data Analysis


New Report from the National Research Council: “From Facebook to Google searches to bookmarking a webpage in our browsers, today’s society has become one with an enormous amount of data. Some internet-based companies such as Yahoo! are even storing exabytes (10^18 bytes) of data. Like these companies and the rest of the world, scientific communities are also generating large amounts of data—mostly terabytes and in some cases near petabytes—from experiments, observations, and numerical simulation. However, the scientific community, along with the defense enterprise, has been a leader in generating and using large data sets for many years. The issue that arises with this new type of large data is how to handle it—this includes sharing the data, enabling data security, working with different data formats and structures, dealing with the highly distributed data sources, and more.
Frontiers in Massive Data Analysis presents the Committee on the Analysis of Massive Data’s work to make sense of the current state of data analysis for mining of massive sets of data, to identify gaps in current practice, and to develop methods to fill these gaps. The committee thus examines the frontiers of research enabling the analysis of massive data, including data representation and methods for including humans in the data-analysis loop. The report includes the committee’s recommendations, details concerning types of data that build into massive data, and information on the seven computational giants of massive data analysis.”