Dave Banisar at Article 19: “It is important to recognize the utility that data can bring. Data can ease analysis, reveal important patterns and facilitate comparisons. For example, the Transactional Records Access Clearinghouse (TRAC – http://trac.syr.edu) at Syracuse University uses data sets from the US Department of Justice to analyze how the federal government enforces its criminal and civil laws, showing how laws are applied differently across the US.
The excitement over “E-government” in the late 1990s (manufactured in part by ICT companies) imagined a brave new e-world where governments would quickly and easily provide needed information and services to their citizens. This was presented as an alternative to the “reactive” and “confrontational” right to information laws, but eventually led to the realization that ministerial web pages and the ability to pay tickets online did not lead to open government. Singapore ranks near the top every year on e-government but is clearly not an ‘open government’. Similarly, it is important to recognize that governments providing data through voluntary measures is not enough.
For open data to promote open government, it needs to operate within a framework of law and regulation that ensures that information is collected, organized and stored and then made public in a timely, accurate and useful form. The information must be more than just what government bodies find useful to release, but what is important for the public to know to ensure that those bodies are accountable.
Otherwise, it is in danger of just being propaganda, subject to manipulation to make government bodies look good. TRAC has had to sue the US federal government dozens of times under the Freedom of Information Act to obtain government data, and even after it is published, some government bodies still claim that the information is incorrect. Voluntary systems of publication usually fail when the results would potentially embarrass the bodies doing the publishing.
In the countries where open data has been most successful, such as the US and UK, there also exists a legal right to demand information, which keeps bodies honest. Most open government laws around the world now have requirements for affirmative publication of key information, and they are slowly being amended to include open data requirements to ensure that the information is more easily usable.
Where open government laws are weak or absent, many barriers can obstruct open data. In Kenya, which has been championing its open data portal while being slow to adopt a law on freedom of information, a recent review found that the portal was stagnating. In part, the problem was that in the absence of laws mandating openness, there remains a culture of secrecy and fear of releasing information.
Further, mere access to data is not enough to ensure informed participation by citizens and enable their ability to affect decision-making processes. Legal rights to all information held by governments – right to information laws – are essential to tell the “why”. RTI reveals how and why decisions and policy are made – secret meetings, questionable contracts, dubious emails and other information. These are essential elements for oversight and accountability. Being able to document that a road was built for political reasons is as crucial for change as recognizing that it’s in the wrong place. TRAC’s users, mostly journalists, use the system as a starting point to ask questions about why enforcement is so uneven or why taxes are not being collected. They need sources and open government laws to ask these questions.
Of course, even open government laws are not enough. There need to be strong rights for citizen consultation and participation and the ability to enforce those rights, such as is mandated by the UNECE Convention on Access to Environmental Information, Public Participation and Access to Justice (Aarhus Convention). A protocol to that convention has led to a Europe-wide data portal on environmental pollution.
For open data to be truly effective, there needs to be a right to information enshrined in law that requires that information be made available in a timely, reliable format that people want, not just what the government body wants to release. And it needs to be backed up with rights of engagement and participation. From this, open data can flourish. The OGP needs to refocus on the building blocks of open government – good law and policy – and not just the flashy apps.”
Beyond Transparency
New book on Open Data and the Future of Civic Innovation: The rise of open data in the public sector has sparked innovation, driven efficiency, and fueled economic development. And in the vein of high-profile federal initiatives like Data.gov and the White House’s Open Government Initiative, more and more local governments are making their foray into the field with Chief Data Officers, open data policies, and open data catalogs.
While still emerging, we are seeing evidence of the transformative potential of open data in shaping the future of our civic life. It’s at the local level that government most directly impacts the lives of residents—providing clean parks, fighting crime, or issuing permits to open a new business. This is where there is the biggest opportunity to use open data to reimagine the relationship between citizens and government.
Beyond Transparency is a cross-disciplinary survey of the open data landscape, in which practitioners share their own stories of what they’ve accomplished with open civic data. It seeks to move beyond the rhetoric of transparency for transparency’s sake and towards action and problem solving. Through these stories, we examine what is needed to build an ecosystem in which open data can become the raw materials to drive more effective decision-making and efficient service delivery, spur economic activity, and empower citizens to take an active role in improving their own communities….
This book is a resource for (and by) practitioners inside and outside government—from the municipal chief information officer to the community organizer to the civic-minded entrepreneur. Beyond Transparency is intended to capture and distill the community’s learnings around open data over the past four years. And we know that the community is going to continue learning. That’s why, in addition to the print version of the book, which you can order on Amazon, we’ve also published the digital version of this book on this site under a Creative Commons license. The full text of this site is on GitHub — which means that anyone can submit a pull request with a suggested edit. Help us improve this resource for the community and write the next edition of Beyond Transparency by submitting your pull requests.
Code for America is a national nonprofit committed to building a government for the people, by the people, that works in the 21st century. Over the past four years, CfA has worked with dozens of cities to support civic innovation through open data. You can support this work by contributing to the book on GitHub, joining the CfA volunteer community (the Brigade), or connecting your city with CfA.
Google and NASA's Quantum Artificial Intelligence Lab
A peek at the early days of the Quantum AI Lab: a partnership between NASA, Google, and a 512-qubit D-Wave Two quantum computer.
Learn more at http://google.com/+QuantumAILab
GitHub and Government
New site: “Make government better, together. Stories of open source, open data, and open government.
This site is an open source effort to showcase best practices of open sourcing government. See something that you think could be better? Want to submit your own story? Simply fork the project and submit a pull request.
…
Ready to get started on GitHub? Here are some easy ways to get your feet wet.
Feedback Repository
GitHub’s about connecting with developers. Whether you’re an API publishing pro, or just getting started, creating a “feedback” repository can go a long way to connect your organization with the community. Get feedback from current and potential data consumers by creating a specific repository for them to contribute ideas and suggestions for types of data or other information they’d like to see opened. Here’s how:
- Create a new repository
  - Choose your organization as the Owner
  - Name the repository “feedback” or similar
  - Click the checkbox to automatically create a README.md file
- Set up your README
  - Click README.md within your newly created repository
  - Click Edit
  - Introduce yourself, describe why you’ve joined GitHub, what you’re hoping to do and what you’d like to learn from the development community. Encourage them to leave feedback through issues on the repository.
  - Click Commit
Sample text for your README.md:
# City of Gotham Feedback
We've just joined GitHub and want to know what data would be interesting to our development community?
Leave us comments via issues!
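If your team prefers scripting to clicking, the same setup can be done against the GitHub REST API. The sketch below is only an illustration, assuming a placeholder organization name and personal access token; it creates the feedback repository with an auto-initialized README.md, which can then be edited in the browser as described above.

```python
import requests

GITHUB_TOKEN = "YOUR_PERSONAL_ACCESS_TOKEN"  # placeholder; never commit real tokens
ORG = "CityOfGotham"                         # placeholder organization name

# Create a "feedback" repository under the organization, auto-initialized with a
# README.md (the API equivalent of ticking the checkbox in the web interface).
resp = requests.post(
    f"https://api.github.com/orgs/{ORG}/repos",
    headers={
        "Authorization": f"token {GITHUB_TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "name": "feedback",
        "description": "Tell us what data you would like to see opened",
        "auto_init": True,  # creates the initial README.md
    },
)
resp.raise_for_status()
print("Created:", resp.json()["html_url"])
```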
Open source a Dataset
Open sourcing a dataset can be as simple as uploading a .csv to GitHub and letting people know about it. Rather than publishing data as a zip file on your website or an FTP server, you can add the files through the GitHub.com web interface, or via the GitHub for Windows or GitHub for Mac native clients. Create a new repository to store your datasets – in many cases, it’s as easy as drag, drop, sync.
GitHub can host any file type (although open, non-binary files like .csvs tend to work best). Plus, GitHub supports rendering certain open data formats interactively, such as the popular geospatial .geojson format. Once uploaded, citizens can view the files, and can even open issues or submit pull requests with proposed fixes.
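As a small illustration of how little work this can be, the sketch below (with hypothetical file and column names) converts a plain CSV of point locations into a .geojson file that GitHub will render as an interactive map:

```python
import csv
import json

features = []
# Hypothetical input: one row per location with "name", "lat" and "lon" columns.
with open("hydrants.csv", newline="") as f:
    for row in csv.DictReader(f):
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON coordinates are ordered [longitude, latitude]
                "coordinates": [float(row["lon"]), float(row["lat"])],
            },
            "properties": {"name": row["name"]},
        })

with open("hydrants.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)
```

Commit the resulting file to a repository and GitHub displays it on a map, where citizens can inspect individual points or propose corrections via pull requests.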
Explore Open Source Civic Apps
There are many open source applications freely available on GitHub that were built just for government. Check them out, and see if one fits a need. Here are some examples:
- Adopt-a – This open source web app was created for the City of Boston in 2011 by Code for America fellows. It allows residents to “adopt” a hydrant and make sure it’s clear of snow in the winter so that emergency crews can locate them when needed. It has since been adopted in Chicago (for sidewalks), Seattle (for storm drains), and Honolulu (for tsunami sirens).
- StreetMix – Another creation of Code for America fellows (2013), this website, www.streetmix.net, allows anyone to create street sections in a way that is not only beautiful but educational, too. No downloading, no installing, no paying – make and save your creations right on the website. Great for internal or public community planning meetings.
- We The People – We The People, the White House’s petitions application hosted at petitions.whitehouse.gov, is a Drupal module that allows citizens to submit and digitally sign petitions.
Open source something small
Chances are you’ve got something small you can open source. Check in with your web or new media team, and see if they’ve got something they’ve been dying to share or blog about, no matter how small. It can be a snippet of analytics code, or maybe a small script used internally. It doesn’t even have to be code.
Post your website’s privacy policy, comment moderation policy, or terms of service and let the community weigh in before your next edit. No matter how small it is, getting your first open source project going is a great first step.
Improve an existing project
Does your agency use an existing open source project to conduct its own business? Open an issue on the project’s repository with a feature request or a bug you spot. Better yet, fork the project and submit your improvements. Even if it’s one or two lines of code, such examples are great to blog about to showcase your efforts.
Don’t forget, this site is an open source project, too. Making a needed edit is another great way to get started.”
And Data for All: On the Validity and Usefulness of Open Government Data
Paper presented at the 13th International Conference on Knowledge Management and Knowledge Technologies: “Open Government Data (OGD) stands for a relatively young trend to make data that is collected and maintained by state authorities available to the public. Although various Austrian OGD initiatives have been started in the last few years, little is known about the validity and the usefulness of the data offered. Based on the data-set on Vienna’s stock of trees, we address two questions in this paper. First of all, we examine the quality of the data by validating it according to knowledge from a related discipline. It shows that the data-set we used correlates with findings from meteorology. Then, we explore the usefulness and exploitability of OGD by describing a concrete scenario in which this data-set can be supportive for citizens in their everyday life and by discussing further application areas in which OGD can be beneficial for different stakeholders and even commercially used.”
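To make the abstract’s approach concrete, here is a minimal sketch of that kind of plausibility check in pandas. The file and column names are assumptions for illustration, not the actual schema of the Vienna tree register:

```python
import pandas as pd

# Hypothetical inputs: a tree register with one row per tree (including a
# "district" column) and a per-district table of mean summer temperatures.
trees = pd.read_csv("vienna_trees.csv")
weather = pd.read_csv("district_weather.csv")

# Aggregate the open data set, join it with the reference data, and check
# whether the two agree in the expected direction.
trees_per_district = trees.groupby("district").size().rename("tree_count")
merged = weather.set_index("district").join(trees_per_district)
print(merged["tree_count"].corr(merged["mean_summer_temp_c"]))
```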
Choose Your Own Route on Finland's Algorithm-Driven Public Bus
Brian Merchant at Motherboard: “Technology should probably be transforming public transit a lot faster than it is. Yes, apps like Hopstop have made finding stops easier and I’ve started riding the bus in unfamiliar parts of town a bit more often thanks to Google Maps’ route info. But these are relatively small steps, and it’s all limited to making scheduling information more widely available. Where’s the innovation on the other side? Where’s the Uber-like interactivity, the bus that comes to you after a tap on the iPhone?
In Finland, actually. The Kutsuplus is Helsinki’s groundbreaking mass transit hybrid program that lets riders choose their own routes, pay for fares on their phones, and summon their own buses. It’s a pretty interesting concept. With a ten-minute lead time, you summon a Kutsuplus bus to a stop using the official app, just as you’d call a livery cab on Uber. Each minibus in the fleet seats at least nine people, and there’s room for baby carriages and bikes.
You can call your own private Kutsuplus, but if you share the ride, you share the costs—it’s about half the price of a cab fare, and a dollar or two more expensive than old-school bus transit. You can then pick your own stop, also using the app.
The interesting part is the scheduling, which is entirely automated. If you’re sharing the ride, an algorithm determines the most direct route, and you only get charged as though you were riding solo. You can pay with a Kutsuplus wallet on the app, or, eventually, bill the charge to your phone bill.”
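The article does not describe Kutsuplus’s actual routing logic, but the idea of folding a new rider into an existing route can be illustrated with a toy insertion heuristic. Everything below is a simplified sketch using straight-line distances, not the production algorithm:

```python
def dist(a, b):
    """Straight-line distance between two (x, y) points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def route_length(stops):
    """Total length of a route visiting the stops in order."""
    return sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))

def best_insertion(current_stops, pickup, dropoff):
    """Insert a new pickup and dropoff into an existing stop sequence so the
    total route is as short as possible, keeping the pickup before the dropoff
    and the bus's current position first."""
    best = None
    for i in range(1, len(current_stops) + 1):
        for j in range(i + 1, len(current_stops) + 2):
            candidate = list(current_stops)
            candidate.insert(i, pickup)
            candidate.insert(j, dropoff)
            if best is None or route_length(candidate) < route_length(best):
                best = candidate
    return best

# Example: a minibus at (0, 0) with two planned stops picks up a new rider
# travelling from (1, 2) to (4, 2).
print(best_insertion([(0, 0), (3, 0), (6, 1)], (1, 2), (4, 2)))
```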
Seven Principles for Big Data and Resilience Projects
PopTech & Rockefeller Bellagio Fellows: “The following is a draft “Code of Conduct” that seeks to provide guidance on best practices for resilience-building projects that leverage Big Data and Advanced Computing. These seven core principles serve to guide data projects to ensure they are socially just, encourage local wealth and skill creation, require informed consent, and are maintainable over long timeframes. This document is a work in progress, so we very much welcome feedback. Our aim is not to enforce these principles on others but rather to hold ourselves accountable and in the process encourage others to do the same. Initial versions of this draft were written during the 2013 PopTech & Rockefeller Foundation workshop in Bellagio, August 2013.
Open Source Data Tools – Wherever possible, data analytics and manipulation tools should be open source, architecture independent and broadly prevalent (R, python, etc.). Open source, hackable tools are generative, and building generative capacity is an important element of resilience….
Transparent Data Infrastructure – Infrastructure for data collection and storage should operate based on transparent standards to maximize the number of users that can interact with the infrastructure. Data infrastructure should strive for built-in documentation, be extensive and provide easy access. Data is only as useful to the data scientist as her/his understanding of its collection is correct…
Develop and Maintain Local Skills – Make “Data Literacy” more widespread. Leverage local data labor and build on existing skills. The key and most constrained ingredient in effective data solutions remains human skill and knowledge, which needs to be retained locally. In doing so, consider cultural issues and language. Catalyze the next generation of data scientists and generate new required skills in the cities where the data is being collected…
Local Data Ownership – Use Creative Commons and licenses that state that data is not to be used for commercial purposes. The community directly owns the data it generates, along with the learning algorithms (machine learning classifiers) and derivatives. Strong data protection protocols need to be in place to protect identities and personally identifying information…
Ethical Data Sharing – Adopt existing data sharing protocols like the ICRC’s (2013). Permission for sharing is essential. How the data will be used should be clearly articulated. An opt in approach should be the preference wherever possible, and the ability for individuals to remove themselves from a data set after it has been collected must always be an option. Projects should always explicitly state which third parties will get access to data, if any, so that it is clear who will be able to access and use the data…
Right Not To Be Sensed – Local communities have a right not to be sensed. Large scale city sensing projects must have a clear framework for how people are able to be involved or choose not to participate. All too often, sensing projects are established without any ethical framework or any commitment to informed consent. It is essential that the collection of any sensitive data, from social and mobile data to video and photographic records of houses, streets and individuals, is done with full public knowledge, community discussion, and the ability to opt out…
Learning from Mistakes – Big Data and Resilience projects need to be open to face, report, and discuss failures. Big Data technology is still very much in a learning phase. Failure and the learning and insights resulting from it should be accepted and appreciated. Without admitting what does not work we are not learning effectively as a community. Quality control and assessment for data-driven solutions is notably harder than comparable efforts in other technology fields. The uncertainty about quality of the solution is created by the uncertainty inherent in data…”
Five Ways to Make Government Procurement Better
Mark Headd at Civic Innovations: “Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of healthcare.gov.
There has been a myriad of blog posts, stories and articles written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.
Though the details of this high profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken…
With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.
Raise the threshold on simplified / streamlined procurement
Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.
Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.
Identify clear standards for projects
Having a clear set of vendor-agnostic IT standards to use when developing RFPs and in performing work can make a huge difference in how a project turns out. Clearly articulating standards for:
- The various components that a system will use.
- The environment in which it will be housed.
- The testing it must undergo prior to final acceptance.
…can go a long way to reduce the risk and uncertainty inherent in IT projects.
It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.
Require open source
Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.
In addition, government RFPs should encourage the use of existing open source tools – leveraging existing software components that are in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their project open source, they enable anyone that understands software development to help make them better.
Develop a more robust internal capacity for IT project management and implementation
Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.
Part of the reason that governments make use of a variety of different risk mitigation provisions in public bidding is that there is a lack of people in government with hands on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the perceived risk that governments take on with new technology projects and the lack of experienced technologists working in government.
Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.
Make contracting, lobbying and campaign contribution data public as open data
One of the more disheartening revelations to come out of the analysis of the healthcare.gov implementation is that some of the firms that were awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local level as well.
This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.
In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency around the bidding process by working to ensure that all contracting data as well as data listing publicly registered lobbyists and contributions to political campaigns is open.
Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms to participate as possible – including small firms more adept at agile software development methodologies. More bids typically equates to higher quality proposals and lower prices.
None of the changes listed above will be easy, and governments are positioned differently in how well they may achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are items that I personally think are important and very achievable.
One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and different states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.”
Connecting Grassroots and Government for Disaster Response
New Report by John Crowley for the Wilson Center: “Leaders in disaster response are finding it necessary to adapt to a new reality. Although community actions have always been the core of the recovery process, collective action from the grassroots has changed response operations in ways that few would have predicted. Using new tools that interconnect over expanding mobile networks, citizens can exchange information via maps and social media, then mobilize thousands of people to collect, analyze, and act on that information. Sometimes, community-sourced intelligence may be fresher and more accurate than the information given to the responders who provide aid…
Also see the companion report from our September 2012 workshop, written by Ryan Burns and Lea Shanley, as well as a series of videos from the workshop and podcasts with workshop participants.”
Sir Tim Berners-Lee: The many meanings of Open
Sir Tim Berners-Lee: “I was recently asked to talk about the idea of “open”, and I realized the term is used in at least eight different ways. The distinct interpretations are all important in different but interlocking ways. Getting them confused leads to a lot of misunderstanding, so it’s good to review them all.
When we tease apart their meanings, we can understand more clearly which aspects of each are the most important. The first, one of the most important forms of openness for the Web, is its universality.
Universality – When I designed the Web protocols, I had already seen many networked information systems fail because they made some assumptions about the users – that they were using a particular type of computer for instance – or constrained the way they worked, such as forcing them to organize their data in a particular way, or to use a particular data format. The Web had to avoid these issues. The goal was that anyone should be able to publish anything on the Web and so it had to be universal in that it was independent of all these technical constraints, as well as language, character sets, and culture….
Open Standards
The actual design of the Web involved the creation of open standards – and getting people to agree to use them globally. The World Wide Web Consortium (W3C), of which I am the Director, helps create interoperable standards for Web technology, including HTML5, mobile Web, graphics, the Semantic Web of linked data, and Web accessibility. Any company can join and anyone can review and help create the specifications for the Web….
Open Web Platform (OWP)
W3C’s Open Web Platform is the name for a particular set of open standards which enable an exciting stage of Web computing. Standards such as HTML5, SVG, CSS, video, JavaScript, and others are advancing together so that programmes that once worked only on desktops, tablets or phones can now work from within the browser itself. It has all the power of HTML5, like easily inserted video and, in the future, easily inserted conferences. It also features the APIs for accessing hardware and other capabilities on the device, such as a smartphone’s accelerometer, camera, and local storage. While native apps are limited, Web Apps can work on any platform….
Open Government through Open Data
In 2009, I resolved to encourage more use of data on the Web. Too many websites could generate nice reports as documents, but had no way to access the data behind it to check and build on the results. In February that year I stood up in front of a TED audience and asked them for their data; I even got them to chant: “raw data now”. In April that year, I met with Gordon Brown, then Prime Minister of the UK and with him began the UK Government’s ground-breaking work on Open Data. That same year President Barack Obama announced his commitment to the US Open Government Initiative. In 2010 I went back to TED and showed the audience some of what had been achieved, including Open Street Map’s role in relief efforts in Haiti….
Open Platform
While it’s not really a feature of the Web, a concern for a lot of people is whether they can choose which apps run on their own phone or computer. An Open Platform means having the right to install and write software on your computer or device. One motivation to close off a computing platform comes from a manufacturer wanting to allow you to experience their content on your machine without being able to store it or pass it on. Some systems are very closed, in that the user can only watch a movie or play a game, with no chance to copy anything or back it up. Some systems are very open, allowing users to take copies of files and run any application they like. Many systems fall in between, letting users pay for additional material or an experience…
Open Source
“Open Source” is another way “open” is used on the web, one which has been and is very important to the Web’s growth. It’s important to me that I can get at the source code of any software I’m using. If I can get at the source code, can I modify it? Can I distribute the modified code and run it on my machine? As Free Software Foundation lead Richard Stallman puts it, “free as in freedom rather than free as in beer”.
Open Access
Open Access is a Web-based movement specifically about free (as in beer) access to the body of academic learning. Governments, and therefore taxpayers, pay for research via grants but often the results of the research are kept in closed-access academic journals. The results are only available to those at big universities. The poor and those in remote rural areas cannot participate…
Open Internet and Net Neutrality
When we talk about keeping the internet free and open, we are often worried about blocking and spying. One of the ways in which we protect the Web is by ensuring Net Neutrality. Net Neutrality is about non-discrimination. Its principle is that if I pay to connect to the Net with a certain quality of service, and you pay to connect with that or a greater quality of service, then we can both communicate at the same level. This is important because it allows an open, fair market. It’s essential to an open, fair democracy. The alternative is a Web in which governments or large companies, or frequently a close association of the two, try to control the internet, with packets of information delivered in a way that discriminates for commercial or political reasons. Regimes of every sort spy on their citizens, deriving hugely accurate and detailed profiles of them and their intimate lives. Today, the battle is building. The rights of individual people on the Web are being attacked, and at the moment only a few people really understand and realize what is going on.”