This App Lets You See The Tough Choices Needed To Balance Your City’s Budget


Jay Cassano at FastCoExist: “Ask the average person on the street how much money their city spends on education or health care or police. Even the most well-informed probably won’t be able to come up with a dollar amount. That’s because even if you are interested, municipal budgets aren’t presented in a way that makes sense to ordinary people.

Balancing Act is a web app that displays a straightforward pie chart of a city’s budget, broken down into categories like pensions, parks and recreation, police, and education. But it doesn’t just display the current budget breakdown. It invites users to tweak it, expressing their own priorities, all while keeping the city in the black. Do you want your libraries to be better funded? Fine—but you’re going to have to raise property taxes to do it.
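
To make that mechanic concrete, here is a minimal sketch of the kind of balance check a tool like Balancing Act enforces. The categories, dollar figures, and function are illustrative assumptions, not the app’s actual code.

```python
# Minimal sketch of the balancing mechanic described above; category names,
# figures, and the balance check are illustrative, not Balancing Act's code.

def is_balanced(revenues, spending):
    """Return True if total revenue covers total spending."""
    return sum(revenues.values()) >= sum(spending.values())

revenues = {"property_tax": 280.0, "sales_tax": 120.0, "grants": 60.0}   # $ millions
spending = {"education": 230.0, "police": 110.0, "parks": 40.0,
            "pensions": 55.0, "libraries": 25.0}

spending["libraries"] += 10.0             # a user boosts library funding
print(is_balanced(revenues, spending))    # False - the budget is now in the red

revenues["property_tax"] += 10.0          # ...so an offsetting choice is required
print(is_balanced(revenues, spending))    # True - back in the black
```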

“Balancing Act provides a way for people to both understand what public entities are doing and then to weigh that against the other possible things that government can do,” says Chris Adams, president of Engaged Public, a Colorado-based consulting firm that develops technology for government and non-profits. “Especially in this era of information, all of us have a responsibility to spend a bit of time understanding how our government is spending money on our behalf.”

Hartford, Connecticut, is the first city in the country to use Balancing Act. The city was facing a $49 million budget deficit this spring, and Mayor Pedro Segarra says he took input from citizens using Balancing Act. Meanwhile, in Engaged Public’s home state, residents can input their income to generate an itemized tax receipt and then tweak the Colorado state budget as they see fit.

Engaged Public hopes that by making budgets more interactive and accessible, more people will take an interest in them.

“Budget information almost universally exists, but it’s not in accessible formats—mostly they’re in PDF files,” says Adams. “So citizens are invited to pore through tens of thousands of pages of PDFs. But that really doesn’t give you a high-level understanding of what’s at stake in a reasonable amount of time.”

If widely used, Balancing Act could be a useful tool for politicians to check the pulse of their constituents. For example, decreasing funding for parks typically draws a negative public reaction. But if enough people on Balancing Act experimented with the budget, saw the necessity of it, and submitted their recommendations, then an elected official might be willing to make a decision that would otherwise seem politically risky….(More)”

When Guarding Student Data Endangers Valuable Research


Susan M. Dynarski in the New York Times: “There is widespread concern over threats to privacy posed by the extensive personal data collected by private companies and public agencies.

Some of the potential danger comes from the government: The National Security Agency has swept up the telephone records of millions of people, in what it describes as a search for terrorists. Other threats are posed by hackers, who have exploited security gaps to steal data from retail giants like Target and from the federal Office of Personnel Management.

Resistance to data collection was inevitable — and it has been particularly intense in education.

Privacy laws have already been strengthened in some states, and multiple bills now pending in state legislatures and in Congress would tighten the security and privacy of student data. Some of this proposed legislation is so broadly written, however, that it could unintentionally choke off the use of student data for its original purpose: assessing and improving education. This data has already exposed inequities, allowing researchers and advocates to pinpoint where poor, nonwhite and non-English-speaking children have been educated inadequately by their schools.

Data gathering in education is indeed extensive: Across the United States, large, comprehensive administrative data sets now track the academic progress of tens of millions of students. Educators parse this data to understand what is working in their schools. Advocates plumb the data to expose unfair disparities in test scores and graduation rates, building cases to target more resources for the poor. Researchers rely on this data when measuring the effectiveness of education interventions.

To my knowledge there has been no large-scale, Target-like theft of private student records — probably because students’ test scores don’t have the market value of consumers’ credit card numbers. Parents’ concerns have mainly centered not on theft, but on the sharing of student data with third parties, including education technology companies. Last year, parents resisted efforts by the tech start-up InBloom to draw data on millions of students into the cloud and return it to schools as teacher-friendly “data dashboards.” Parents were deeply uncomfortable with a third party receiving and analyzing data about their children.

In response to such concerns, some pending legislation would scale back the authority of schools, districts and states to share student data with third parties, including researchers. Perhaps the most stringent of these proposals, sponsored by Senator David Vitter, a Louisiana Republican, would effectively end the analysis of student data by outside social scientists. This legislation would have banned recent prominent research documenting the benefits of smaller classes, the value of excellent teachers and the varied performance of charter schools.

Under current law, education agencies can share data with outside researchers only to benefit students and improve education. Collaborations with researchers allow districts and states to tap specialized expertise that they otherwise couldn’t afford. The Boston public school district, for example, has teamed up with early-childhood experts at Harvard to plan and evaluate its universal prekindergarten program.

In one of the longest-standing research partnerships, the University of Chicago works with the Chicago Public Schools to improve education. Partnerships like Chicago’s exist across the nation, funded by foundations and the United States Department of Education. In one initiative, a Chicago research consortium compiled reports showing high school principals that many of the seniors they had sent off to college swiftly dropped out without earning a degree. This information spurred efforts to improve high school counseling and college placement.

Specific, tailored information in the hands of teachers, principals or superintendents empowers them to do better by their students. No national survey could have told Chicago’s principals how their students were doing in college. Administrative data can provide this information, cheaply and accurately…(More)”

Introducing the Governance Data Alliance


“The overall assumption of the Governance Data Alliance is that governance data can contribute to improved sustainable economic and human development outcomes and democratic accountability in all countries. The contribution that governance data will make to those outcomes will of course depend on a whole range of issues that will vary across contexts; development processes, policy processes, and the role that data plays vary considerably. Nevertheless, there are some core requirements that need to be met if data is to make a difference, and articulating them can provide a framework to help us understand and improve the impact that data has on development and accountability across different contexts.

We also collectively make another implicit (and important) assumption: that the current state of affairs is vastly insufficient when it comes to the production and usage of high-quality governance data. In other words, the status quo needs to be significantly improved upon. Data gathered from participants in the April 2014 design session help to paint that picture in granular terms. Data production remains highly irregular and ad hoc; data usage does not match data production in many cases (e.g. users want data that do not exist and do not use data that are currently produced); production costs remain high and inconsistent across producers despite possibilities for economies of scale; and feedback loops between governance data producers and governance data users are either non-existent or rarely employed. We direct readers to http://dataalliance.globalintegrity.org for a fuller treatment of those findings.

Three requirements need to be met if governance data is to lead to better development and accountability outcomes, whether those outcomes are about core “governance” issues such as levels of inclusion, or about service delivery and human development outcomes that may be shaped by the quality of governance. Those requirements are:

  • The availability of governance data.
  • The quality of governance data, including its usability and salience.
  • The informed use of governance data.

(Or to use the metaphor of markets, we face a series of market failures: supply of data is inconsistent and not uniform; user demand cannot be efficiently channeled to suppliers to redirect their production to address those deficiencies; and transaction costs abound through non-existent data standards and lack of predictability.)

If data are not available about those aspects of governance that are expected to have an impact on development outcomes and democratic accountability, no progress will be made. The risk is that data about key issues will be lacking, or that there will be gaps in coverage, whether country coverage, time periods covered, or sectors, or that data sets produced by different actors may not be comparable. This might come about for reasons including the following: a lack of knowledge – amongst producers, and amongst producers and users – about what data is needed and what data is available; high costs, and limited resources to invest in generating data; and, institutional incentives and structures (e.g. lack of autonomy, inappropriate mandate, political suppression of sensitive data, organizational dysfunction – relating, for instance, to National Statistical Offices) that limit the production of governance data….

What A Governance Data Alliance Should Do (Or, Making the Market Work)

During the several months of creative exploration around possibilities for a Governance Data Alliance, dozens of activities were identified as possible solutions (in whole or in part) to the challenges identified above. This note identifies what we believe to be the most important and immediate activities that an Alliance should undertake, knowing that other activities can and should be rolled into an Alliance work plan in the out years as the initiative matures and early successes (and failures) are achieved and digested.

A brief summary of the proposals that follow:

  1. Design and implement a peer-to-peer training program between governance data producers to improve the quality and salience of existing data.
  2. Develop a lightweight data standard to be adopted by producer organizations to make it easier for users to consume governance data.
  3. Mine the 2014 Reform Efforts Survey to understand who currently uses which governance data around the world.
  4. Leverage the 2014 Reform Efforts Survey “plumbing” to field customized follow-up surveys to better assess what data users seek in future governance data.
  5. Pilot (on a regional basis) coordinated data production amongst producer organizations to fill coverage gaps, reduce redundancies, and respond to actual usage and user preferences….(More)”

Confidence in U.S. Institutions Still Below Historical Norms


Jeffrey M. Jones at Gallup: “Americans’ confidence in most major U.S. institutions remains below the historical average for each one. Only the military (72%) and small business (67%) — the highest-rated institutions in this year’s poll — are currently rated higher than their historical norms, based on the percentage expressing “a great deal” or “quite a lot” of confidence in the institution.

[Chart: Confidence in U.S. Institutions, 2015 vs. Historical Average for Each Institution]

These results are based on a June 2-7 Gallup poll that included Gallup’s latest update on confidence in U.S. institutions. Gallup first measured confidence ratings in 1973 and has updated them each year since 1993.

Americans’ confidence in most major institutions has been down for many years as the nation has dealt with prolonged wars in Iraq and Afghanistan, a major recession and sluggish economic improvement, and partisan gridlock in Washington. In fact, 2004 was the last year most institutions were at or above their historical average levels of confidence. Perhaps not coincidentally, 2004 was also the last year Americans’ satisfaction with the way things are going in the United States averaged better than 40%. Currently, 28% of Americans are satisfied with the state of the nation.

From a broad perspective, Americans’ confidence in all institutions over the last two years has been the lowest since Gallup began systematic updates of a larger set of institutions in 1993. The average confidence rating of the 14 institutions asked about annually since 1993 — excluding small business, asked annually since 2007 — is 32% this year. This is one percentage point above the all-institution average of 31% last year. Americans were generally more confident in all institutions in the late 1990s and early 2000s as the country enjoyed a strong economy and a rally in support for U.S. institutions after the 9/11 terrorist attacks.

[Chart: Average Confidence Rating Across All Institutions, by Year]

Confidence in Political, Financial and Religious Institutions Especially Low

Today’s confidence ratings of Congress, organized religion, banks, the Supreme Court and the presidency show the greatest deficits compared with their historical averages, all running at least 10 points below that mark. Americans’ frustration with the government’s performance has eroded the trust they have in all U.S. political institutions….(More)”

How Crowdsourcing Can Help Us Fight ISIS


 at the Huffington Post: “There’s no question that ISIS is gaining ground. …So how else can we fight ISIS? By crowdsourcing data – i.e. asking a relevant group of people for their input via text or the Internet on specific ISIS-related issues. In fact, ISIS has been using crowdsourcing to enhance its operations since last year in two significant ways. Why shouldn’t we?

First, ISIS is using its crowd of supporters in Syria, Iraq and elsewhere to help strategize new policies. Last December, the extremist group leveraged its global crowd via social media to brainstorm ideas on how to kill 26-year-old Jordanian coalition fighter pilot Moaz al-Kasasba. ISIS supporters used the hashtags “Suggest a Way to Kill the Jordanian Pilot Pig” and “We All Want to Slaughter Moaz” to make their disturbing suggestions, which included decapitation, running al-Kasasba over with a bulldozer and burning him alive (which was the winner). Yes, this sounds absurd and was partly a publicity stunt to boost ISIS’ image. But the underlying approach of crowdsourcing new strategies makes complete sense for ISIS as it continues to evolve – which is what the US government should consider as well.

In fact, in February, the US government tried to crowdsource more counterterrorism strategies. Via its official blog, DipNote, the State Department asked the crowd – in this case, US citizens – for their suggestions for solutions to fight violent extremism. This inclusive approach to policymaking was obviously important for strengthening democracy, with more than 180 entries posted over two months from citizens across the US. But did this crowdsourcing exercise actually improve US strategy against ISIS? Not really. What might help is if the US government asked a crowd of experts across varied disciplines and industries about counterterrorism strategies specifically against ISIS, also giving these experts the opportunity to critique each other’s suggestions to reach one optimal strategy. This additional, collaborative, competitive and interdisciplinary expert insight can only help President Obama and his national security team to enhance their anti-ISIS strategy.

Second, ISIS has been using its crowd of supporters to collect intelligence information to better execute its strategies. Since last August, the extremist group has crowdsourced data via a Twitter campaign specifically on Saudi Arabia’s intelligence officials, including names and other personal details. This apparently helped ISIS in its two suicide bombing attacks during prayers at a Shite mosque last month; it also presumably helped ISIS infiltrate a Saudi Arabian border town via Iraq in January. This additional, collaborative approach to intelligence collection can only help President Obama and his national security team to enhance their anti-ISIS strategy.

In fact, last year, the FBI used crowdsourcing to spot individuals who might be travelling abroad to join terrorist groups. But what if we asked the crowd of US citizens and residents to give us information specifically on where they’ve seen individuals get lured by ISIS in the country, as well as on specific recruitment strategies they may have noted? This might also lead to more real-time data points on ISIS defectors returning to the US – who are they, why did they defect and what can they tell us about their experience in Syria or Iraq? Overall, crowdsourcing such data (if verifiable) would quickly create a clearer picture of trends in recruitment and defectors across the country, which can only help the US enhance its anti-ISIS strategies.

This collaborative approach to data collection could also be used in Syria and Iraq with texts and online contributions from locals helping us to map ISIS’ movements….(More)”

Waze and the Traffic Panopticon


 in the New Yorker: “In April, during his second annual State of the City address, Los Angeles Mayor Eric Garcetti announced a data-sharing agreement with Waze, the Google-owned, Israel-based navigation service. Waze is different from most navigation apps, including Google Maps, in that it relies heavily on real-time, user-generated data. Some of this data is produced actively—a driver or passenger sees a stalled vehicle, then uses a voice command or taps a stalled-vehicle icon on the app to alert others—while other data, such as the user’s location and average speed, is gathered passively, via smartphones. The agreement will see the city provide Waze with some of the active data it collects, alerting drivers to road closures, construction, and parades, among other things. From Waze, the city will get real-time data on traffic and road conditions. Garcetti said that the partnership would mean “less congestion, better routing, and a more livable L.A.” Di-Ann Eisnor, Waze’s head of growth, acknowledged to me that these kinds of deals can cause discomfort to the people working inside city government. “It’s exciting, but people inside are also fearful because it seems like too much work, or it seems so unknown,” she said.
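
As a rough illustration of the two streams described above, the sketch below shows what an actively reported incident and a passively gathered ping might look like as records, plus a crude congestion signal computed from the passive stream. The field names and types are hypothetical assumptions, not Waze’s actual schema or API.

```python
# Hypothetical record shapes for the active and passive data streams
# described above; field names are illustrative, not Waze's actual schema.
from dataclasses import dataclass

@dataclass
class ActiveReport:
    """Something a driver or passenger deliberately reports via the app."""
    reporter_id: str
    kind: str          # e.g. "stalled_vehicle", "road_closure", "parade"
    lat: float
    lon: float
    timestamp: float

@dataclass
class PassivePing:
    """Location and speed gathered automatically from the smartphone."""
    device_id: str
    lat: float
    lon: float
    speed_kmh: float
    timestamp: float

def average_speed(pings):
    """Crude congestion signal: mean speed of recent passive pings on a road segment."""
    return sum(p.speed_kmh for p in pings) / len(pings) if pings else None
```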

Indeed, the deal promises to help the city improve some of its traffic and infrastructure systems (L.A. still uses paper to manage pothole patching, for example), but it also acknowledges Waze’s role in the complex new reality of urban traffic planning. Traditionally, traffic management has been a largely top-down process. In Los Angeles, it is coördinated in a bunker downtown, several stories below the sidewalk, where engineers stare at blinking lights representing traffic and live camera feeds of street intersections. L.A.’s sensor-and-algorithm-driven Automated Traffic Surveillance and Control System is already one of the world’s most sophisticated traffic-mitigation tools, but it can only do so much to manage the city’s eternally unsophisticated gridlock. Los Angeles appears to see its partnership with Waze as an important step toward further strengthening the bridge between its subterranean panopticon and the rest of the city, much like other metropolises that have struck deals with Waze under the company’s Connected Cities program.
Among the early adopters is Rio de Janeiro, whose urban command center tracks everything from accidents to hyperlocal weather conditions, pulling data from thirty departments and private companies, including Waze. “In Rio,” Eisnor said, traffic managers “were able to change the garbage routes, figure out where to install cameras, and deploy traffic personnel” because of the program. She also pointed out that Connected Cities has helped municipal workers in Washington, D.C., patch potholes within forty-eight hours of their being identified on Waze. “We’re helping reframe city planning through not just space but space and time,” she said…..(More)

India wants all government organizations to develop open APIs


Medianama: “The department of electronics and information technology (DeitY) is looking to frame a policy (pdf) for adopting and developing open application programming interfaces (APIs) in government organizations to promote software interoperability for all e-governance applications & systems. The policy shall be applicable to all central government organizations and to those state governments that choose to adopt the policy.

DeitY also said that all information and data of a government organisation shall be made available through open APIs, as per the National Data Sharing and Accessibility Policy, and shall adhere to the National Cyber Security Policy.

Policy points

– Each published API of a Government organization shall be provided free of charge whenever possible to other government organizations and the public.

– Each published API shall be properly documented with sample code and sufficient information for developers to make use of the API.

– The life-cycle of the open API shall be made available by the API publishing Government organisation. The API shall be backward compatible with at least two earlier versions.

– Government organizations may use an authentication mechanism to enable service interoperability and single sign-on.

– All Open API systems built and data provided shall adhere to GoI security policies and guidelines.

…. This would allow anyone to build a website or an application and pull government information into the public domain. Everyone knows navigating a government website can be nightmarish. For example, Indian Railways provides open APIs which enabled the development of applications such as RailYatri. Through the eRail APIs, the application pulls information that includes lists of stations, trains between stations, train routes, fares, PNR status, live train status, seat availability, cancelled, rescheduled or diverted trains, and the current running status of a train. …(More)”
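
As a sketch of how a developer might consume such a published open API, the snippet below fetches trains between two stations. The endpoint, parameters, and response fields are hypothetical placeholders, not the actual eRail or Indian Railways interface.

```python
# Sketch of consuming a government open API, in the spirit of the eRail
# example above. The endpoint, parameters, and response fields are
# hypothetical placeholders, not the real eRail / Indian Railways API.
import requests

BASE_URL = "https://api.example.gov.in/rail/v1"   # hypothetical endpoint

def trains_between(src, dest, api_key):
    """Fetch trains running between two station codes."""
    resp = requests.get(
        f"{BASE_URL}/trains-between",
        params={"from": src, "to": dest},
        headers={"Authorization": f"Bearer {api_key}"},   # orgs may require authentication, per the policy
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("trains", [])

# Usage (hypothetical station codes):
# for train in trains_between("NDLS", "BCT", api_key="..."):
#     print(train["number"], train["name"], train["departure"])
```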

See also “Policy on Open Application Programming Interfaces (APIs) for Government of India”

Data Reinvents Libraries for the 21st Century


 in GovTech: “Libraries can evoke tired assumptions. It could be a stack of battered books and yesteryear movies; that odd odor of wilted pages and circa-1970s decor; or it could be a bout of stereotypes, like obsolete encyclopedias and ruler-snapping librarians.

Whatever the case, the truth is that today libraries are proving they’re more than mausoleums of old knowledge. They’re in a state of progressive reform, rethinking services and restructuring with data. It’s a national trend as libraries modernize, strategize and recast themselves as digital platforms. They’ve taken on the role of data curator for information coming in and citizen-generated data going out….

Nate Hill is among this band of progressives. A data zealot who believes in data’s inclination for innovation, the former deputy director of Tennessee’s Chattanooga Public Library led a charge to transform the library into a data-centric community hub. The library boasts an open data portal that it manages for the city, a civic hacker lab, a makerspace for community projects, and expanded access to in-person and online tutorials for coding and other digital skill sets….

The draw in data sharing and creating, Hill said, comes from the realization that today’s data channels are no longer one-way systems.

“I push people to the idea that now it’s about being a producer rather than just a consumer,” Hill said, “because really that whole idea of a read-write Web comes from the notion that you and I, for example, are just as capable at editing Wikipedia articles on the fly and changing information as anybody else.”

For libraries, Hill sees this as an opportunity and asks what institution can better pioneer the new frontier of information exchange. He posits that, as the original public content curator, the library is a natural home for open data. In fact, he says it’s a logical next step when considering that traditional media like books, research journals and other sources infuse data points with rich context — something most city and state open data portals typically don’t do.

“The dream here is to treat the library as a different kind of community infrastructure,” Hill said. “You can conceivably be feeding live data about a city into an open data portal, and at the same time, turning the library into a real live information source — rather than something just static.”

In Chattanooga, an ongoing effort is in the works to do just that. The library seeks to integrate open data into its library catalog searches. Visitors researching Chattanooga’s waterfront could do a quick search and pull up local books, articles and mapping documents, but also a collection of the latest data sets on water pollution and land use, for example.
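
A minimal sketch of that kind of unified search is shown below: one query returns both catalog records and matching open data sets. The records and matching logic are placeholders, not Chattanooga’s actual system.

```python
# Illustrative sketch of the unified search described above: a single query
# runs against both the library catalog and the city's open data portal.
# The records and matching logic are placeholders, not Chattanooga's system.

catalog = [
    {"type": "book", "title": "A History of the Chattanooga Waterfront"},
    {"type": "map", "title": "Waterfront land-use survey, 1998"},
]
open_data = [
    {"type": "dataset", "title": "Waterfront water pollution readings"},
    {"type": "dataset", "title": "Downtown parking occupancy"},
]

def unified_search(query):
    """Return catalog items and open data sets whose titles match the query."""
    q = query.lower()
    return [record for record in catalog + open_data if q in record["title"].lower()]

for result in unified_search("waterfront"):
    print(result["type"], "-", result["title"])
```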

Eyeing the library data movement at scale, Hill said he could easily envision a network of public libraries that act as local data hubs, retrieving and funneling data into larger state and national data portals….(More).

Putting Open at the Heart of the Digital Age


Presentation by Rufus Pollock: “….To repeat then: technology is NOT teleology. The medium is NOT the message – and it’s the message that matters.

The printing press made possible an “open” bible but it was Tyndale who made it open – and it was the openness that mattered.

Digital technology gives us unprecedented potential for creativity, for sharing, for freedom. But these are possibilities, not inevitabilities. Technology alone does not make the choice for us.

Remember that we’ve been here before: the printing press was revolutionary but we still ended up with a print media that was often dominated by the few and the powerful.

Think of radio. If you read about how people talked about it in the 1910s and 1920s, it sounds like the way we talk about the Internet today. The radio was going to revolutionize human communications and society. It was going to enable a peer-to-peer world where everyone could broadcast, it was going to allow new forms of democracy and politics, etc. What happened? We got a one-way medium, controlled by the state and a few huge corporations.

Look around you today.

The Internet’s costless transmission can – and does – create information empires and information robber barons just as easily as it creates digital democracy and information equality.

We already know that this technology offers unprecedented opportunities for surveillance, for monitoring, for tracking. It can just as easily exploit us as empower us.

We need to put openness at the heart of this information age, and at the heart of the Net, if we are really to realize its possibilities for freedom, empowerment, and connection.

The fight, then, is for the soul of this information age, and we have a choice.

A choice of open versus closed.

Of collaboration versus control.

Of empowerment versus exploitation.

It’s a long road ahead – longer perhaps than our lifetimes. But we can walk it together.

In this 21st century knowledge revolution, William Tyndale isn’t one person. It’s all of us, making small and big choices: from getting governments and private companies to release their data, to building open databases and infrastructures together, to choosing apps on your phone that are built on openness and social networks that give you control of your data rather than taking it from you.

Let’s choose openness, let’s choose freedom, let’s choose the infinite possibilities of this digital age by putting openness at its heart….(More)”- See also PowerPoint Presentation

Algorithmic Citizenship


Citizen-Ex: “Algorithmic Citizenship is a new form of citizenship, one where your citizenship, and therefore both your allegiances and your rights, are constantly being questioned, calculated, and rewritten.

Most people are assigned a citizenship at birth, in one of two ways. You may receive your citizenship from the place you’re born, which is called jus soli, or the right of soil. If you’re born in a place, that’s where you’re a citizen of. This is true in a lot of North and South America, for example – but not much of the rest of the world. You may get your citizenship based on where your parents are citizens of, which is called jus sanguinis, or the right of blood. Everybody is supposed to have a citizenship, although millions of stateless people do not, as a result of war, migration or the collapse of existing states. Many people also change citizenship over the course of their life, through various legal mechanisms. Some countries allow you to hold more than one citizenship at once, and some do not.

Having a citizenship means that you have a place in the world, an allegiance to a state. That state is supposed to guarantee you certain rights, like freedom from arrest, imprisonment, torture, or surveillance – depending on which state you belong to. Hannah Arendt famously said that “citizenship is the right to have rights”. To tamper with one’s citizenship is to endanger one’s most fundamental rights. Without citizenship, we have no rights at all.

Algorithmic Citizenship is a form of citizenship which is not assigned at birth, or through complex legal documents, but through data. Like other computerised processes, it can happen at the speed of light, and it can happen over and over again, constantly revising and recalculating. It can split a single citizenship into an infinite number of sub-citizenships, and count and weight them over time to produce combinations of affiliations to different states.

Citizen Ex calculates your Algorithmic Citizenship based on where you go online. Every site you visit is counted as evidence of your affiliation to a particular place, and added to your constantly revised Algorithmic Citizenship. Because the internet is everywhere, you can go anywhere – but because the internet is real, this also has consequences….(More)”
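
A minimal sketch of that kind of calculation is below, assuming each visited site can be attributed to a single country and every visit is weighted equally. The site-to-country lookup and the weighting are illustrative assumptions, not Citizen Ex’s actual algorithm.

```python
# Sketch of the calculation described above: attribute each visit to a country
# and normalize the tallies into proportional affiliations. The lookup table
# and equal weighting are illustrative, not Citizen Ex's actual algorithm.
from collections import Counter

# Hypothetical lookup: where each visited site is hosted or based.
SITE_COUNTRY = {
    "news.example.co.uk": "GB",
    "video.example.com": "US",
    "mail.example.com": "US",
    "shop.example.de": "DE",
}

def algorithmic_citizenship(visited_sites):
    """Return each country's share of the user's browsing, as a fraction."""
    counts = Counter(SITE_COUNTRY.get(site, "unknown") for site in visited_sites)
    total = sum(counts.values())
    return {country: n / total for country, n in counts.items()}

visits = ["video.example.com", "news.example.co.uk", "mail.example.com",
          "video.example.com", "shop.example.de"]
print(algorithmic_citizenship(visits))   # e.g. {'US': 0.6, 'GB': 0.2, 'DE': 0.2}
```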