“This Prezi by César Nicandro Cruz-Rubio is designed for educational purposes. It presents the open government concept and uses some YouTube videos to give examples.”
Developing an open government plan in the open
The development of an OGP National Action Plan, therefore, presents a twofold opportunity for opening up government: On the one hand it should be used to deliver a set of robust and ambitious commitments to greater transparency, participation and accountability. But just as importantly, the process of developing a NAP should also be used to model new forms of open and collaborative working within government and civil society. These two purposes of a NAP should be mutually reinforcing. An open and collaborative process can – as was the case in the UK – help to deliver a more robust and ambitious action plan, which in turn can demonstrate the efficacy of working in the open.
You could even go one step further and say that the development of a National Action Plan should present an (almost) “ideal” vision of what open government in a country could look like. If governments aren’t being open as they’re developing an open government action plan, then there’s arguably little hope that they’ll be open elsewhere.
As coordinators of the UK OGP civil society network, this was on our mind at the beginning and throughout the development of the UK’s 2013-15 National Action Plan. Crucially, it was also on the minds of our counterparts in the UK Government. From the start, therefore, the process was developed with the intention that it should itself model the principles of open government. Members of the UK OGP civil society network met with policy officials from the UK Government on a regular basis to scope out and develop the action plan, and we published regular updates of our discussions and progress for others to follow and engage with. The process wasn’t without its challenges – and there’s still much more we can do to open it up further in the future – but it was successful in moving far beyond the typical model of government deciding, announcing and defending its intentions and in delivering an action plan with some strong and ambitious commitments.
One of the benefits of working in an open and collaborative way is that it enabled us to conduct and publish a full – warts and all – review of what went well and what didn’t. So, consider this an invitation to delve into our successes and failures, a challenge to do it better and a request to help us do so too. Head over to the UK OGP civil society network blog to read about what we did, and tell us what you think: http://www.opengovernment.org.uk/national-action-plan/story-of-the-uk-national-action-plan-2013-15/”
Canadian Organizations Join Forces to Launch Open Data Institute to Foster Open Government
The Open Data Institute, which received support from the Government of Canada in this week’s budget, will work with governments, academic institutions and the private sector to solve challenges facing “open government” efforts and realize the full potential of “open data.”
According to a statement, partners will work on development of common standards, the integration of data from different levels of government and the commercialization of data, “allowing Canadians to derive greater economic benefit from datasets that are made available by all levels of government.”
The Open Data Institute is a public-private partnership. Founding partners will contribute $3 million in cash and in-kind contributions over three years to establish the institute, a figure that has been matched by the Government of Canada.
“This is a strategic investment in Canada’s ability to lead the digital economy,” said Kevin Tuer, Managing Director of CDMN. “Similar to how a common system of telephone exchanges allowed world-wide communication, the Open Data Institute will help create a common platform to share and access datasets.”
“This will allow the development of new applications and products, creating new business opportunities and jobs across the country,” he added.
“The Institute will serve as a common forum for government, academia and the private sector to collaborate on Open Government initiatives with the goal of fueling Canadian tech innovation,” noted OpenText President and CEO Mark J. Barrenechea.
“The Open Data Institute has the potential to strengthen the regional economy and increase our innovative capacity,” added Feridun Hamdullahpur, president and vice-chancellor of the University of Waterloo.
House Bill Raises Questions about Crowdsourcing
Anne Bowser for Commons Lab (Wilson Center): “A new bill in the House is raising some key questions about how crowdsourcing is understood by scientists, government agencies, policymakers and the public at large.
Robin Bravender’s recent article in Environment & Energy Daily, “House Republicans Push Crowdsourcing on Agency Science,” (subscription required) neatly summarizes the debate around H.R. 4012, a bill introduced in the House of Representatives earlier this month. The House Science, Space and Technology Committee held a hearing on the bill earlier this week, and the bill could see a committee vote as early as next month.
Dubbed the “Secret Science Reform Act of 2014,” the bill prohibits the Environmental Protection Agency (EPA) from “proposing, finalizing, or disseminating regulations or assessments based upon science that is not transparent or reproducible.” If the bill is passed, EPA would be unable to base assessments or regulations on any information not “publicly available in a manner that is sufficient for independent analysis.” This would include all information published in scholarly journals based on data that is not available as open source.
The bill is based on the premise that forcing EPA to use public data will inspire greater transparency by allowing “the crowd” to conduct independent analysis and interpretation. While the premise of involving the public in scientific research is sound, this characterization of crowdsourcing as a process separate from traditional scientific research is deeply problematic.
This division contrasts with the current practices of many researchers, who use crowdsourcing to directly involve the public in scientific processes. Galaxy Zoo, for example, enlists digital volunteers (called “citizen scientists”) to help classify more than 40 million photographs of galaxies taken by the Hubble Telescope. These crowdsourced morphological classifications are a powerful form of data analysis, a key aspect of the scientific process. Galaxy Zoo then publishes a catalogue of these classifications as an open-source data set. And the data reduction techniques and measures of confidence and bias for the data catalogue are documented in MNRAS, a peer-reviewed journal. A recent Google Scholar search shows that the data set published in MNRAS has been cited a remarkable 121 times.
As this example illustrates, crowdsourcing is often embedded in the process of formal scientific research. But prior to being published in a scientific journal, the crowdsourced contributions of non-professional volunteers are subject to the scrutiny of professional scientists through the rigorous process of peer review. Because peer review was designed as an institution to ensure objective and unbiased research, peer-reviewed scientific work is widely accepted as the best source of information for any science-based decision.
Separating crowdsourcing from the peer review process, as this legislation intends, means that there will be no formal filters in place to ensure that open data will not be abused by special interests. Ellen Silbergeld, a professor at Johns Hopkins University who testified at the hearing this week, made exactly this point when she pointed to data manipulation commonly practiced by tobacco lobbyists in the United States.
Contributing to scientific research is one goal of crowdsourcing for science. Involving the public in scientific research also increases volunteer understanding of research topics and the scientific process and inspires heightened community engagement. These goals are supported by President Obama’s Second Open Government National Action Plan, which calls for “increased crowdsourcing and citizen science programs” to support “an informed and active citizenry.” But H.R. 4012 does not support these goals. Rather, this legislation could further degrade the public’s understanding of science by encouraging the public to distrust professional scientists rather than collaborate with them.
Crowdsourcing benefits organizations by bringing in the unique expertise held by external volunteers, which can augment and enhance the traditional scientific process. In return, these volunteers benefit from exposure to new and exciting processes, such as scientific research. This mutually beneficial relationship depends on collaboration, not opposition. Supporting an antagonistic relationship between science-based organizations like the EPA and members of “the crowd” will benefit neither institutions, nor volunteers, nor the country as a whole.”
What makes a good API?
Joshua Tauberer’s Blog: “There comes a time in every dataset’s life when it wants to become an API. That might be because of consumer demand or an executive order. How are you going to make a good one?…
Let’s take the common case where you have a relatively static, large dataset that you want to provide read-only access to. Here are 19 common attributes of good APIs for this situation. …
Granular Access. If the user wanted the whole thing they’d download it in bulk, so an API must be good at providing access to the most granular level practical for data users (h/t Ben Balter for the wording on that). When the data comes from a table, this usually means the ability to read a small slice of it using filters, sorting, and paging (limit/offset), the ability to get a single row by identifying it with a persistent, unique identifier (usually a numeric ID), and the ability to select just which fields should be included in the result output (good for optimizing bandwidth in mobile apps, h/t Eric Mill). (But see “intents” below.)
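The filter/sort/page/field-selection behaviour described above can be sketched as a plain function over row dictionaries. This is an illustrative sketch, not code from the post; all names and the sample data are invented:

```python
def query(rows, filters=None, sort_key=None, limit=10, offset=0, fields=None):
    """Granular read access to tabular data: equality filters,
    sorting, limit/offset paging, and field selection."""
    result = [r for r in rows
              if all(r.get(k) == v for k, v in (filters or {}).items())]
    if sort_key:
        result = sorted(result, key=lambda r: r[sort_key])
    result = result[offset:offset + limit]          # limit/offset paging
    if fields:                                      # return only requested fields
        result = [{k: r[k] for k in fields if k in r} for r in result]
    return result

# Hypothetical sample table with a persistent unique "id" per row.
rows = [
    {"id": 1, "city": "Oakland", "pop": 400000},
    {"id": 2, "city": "Raleigh", "pop": 450000},
    {"id": 3, "city": "Oakland", "pop": 410000},
]
print(query(rows, filters={"city": "Oakland"}, sort_key="pop",
            limit=1, offset=0, fields=["id", "pop"]))
# → [{'id': 1, 'pop': 400000}]
```

A real API would expose the same parameters as query-string arguments; the point is that each of them narrows the response so mobile and low-bandwidth clients fetch only what they need.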
Deep Filtering. An API should be good at needle-in-haystack problems. Full text search is hard to do, so an API that can do it relieves a big burden for developers — if your API has any big text fields. Filters that can span relations or cross tables (i.e. joins) can be very helpful as well. But don’t go overboard. (Again, see “intents” below.)
Typed Values. Response data should be typed. That means that whether a field’s value is an integer, text, list, floating-point number, dictionary, null, or date should be encoded as a part of the value itself. JSON and XML with XSD are good at this. CSV and plain XML, on the other hand, are totally untyped. Types must be strictly enforced. Columns must choose a data type and stick with it, no exceptions. When encoding other sorts of data as text, the values must all absolutely be valid according to the most narrow regular expression that you can make. Provide that regular expression to the API users in documentation.
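One way to enforce the "narrowest possible regular expression" rule is to validate every text-encoded value before serializing the response. A minimal sketch, with invented field names and patterns standing in for whatever an API's documentation would publish:

```python
import json
import re

# Narrow patterns for text-encoded values; publish these in the API docs.
FIELD_PATTERNS = {
    "date":    re.compile(r"^\d{4}-\d{2}-\d{2}$"),    # ISO 8601 calendar date
    "zipcode": re.compile(r"^\d{5}(-\d{4})?$"),       # US ZIP / ZIP+4
}

def validate_row(row):
    """Reject any row whose text fields don't match their declared pattern."""
    for field, pattern in FIELD_PATTERNS.items():
        if field in row and not pattern.match(row[field]):
            raise ValueError(f"{field}={row[field]!r} fails /{pattern.pattern}/")
    return row

# Typed JSON output: the integer stays an integer, the dates stay
# validated strings with a documented format.
record = {"id": 42, "date": "2014-02-10", "zipcode": "27601"}
print(json.dumps(validate_row(record)))
```

Running the validator at publish time, rather than trusting the source table, is what keeps the "no exceptions" promise: a row that drifts from the declared type fails loudly instead of silently leaking an untyped value to consumers.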
Normalize Tables, Then Denormalize. Normalization is the process of removing redundancy from tables by making multiple tables. You should do that. Have lots of primary keys that link related tables together. But… then… denormalize. The bottleneck of most APIs isn’t disk space but speed. Queries over denormalized tables are much faster than writing queries with JOINs over multiple tables. It’s faster to get data if it’s all in one response than if the user has to issue multiple API calls (across multiple tables) to get it. You still have to normalize first, though. Denormalized data is hard to understand and hard to maintain.
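The normalize-then-denormalize step can be illustrated with two toy tables pre-joined into a single response (the tables and names here are invented for illustration):

```python
# Normalized source tables: each fact is stored once, linked by primary key.
agencies = {
    1: {"id": 1, "name": "EPA"},
    2: {"id": 2, "name": "DOT"},
}
datasets = [
    {"id": 10, "title": "Air quality",    "agency_id": 1},
    {"id": 11, "title": "Transit routes", "agency_id": 2},
]

def denormalize(datasets, agencies):
    """Pre-join related tables so one API call returns everything."""
    return [{**d, "agency": agencies[d["agency_id"]]["name"]}
            for d in datasets]

print(denormalize(datasets, agencies))
# Each response row now carries the agency name inline,
# so consumers don't need a second call to resolve agency_id.
```

The normalized tables remain the source of truth for maintenance; the denormalized view is just a derived, query-fast serving layer.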
Be RESTful, And More. “REST” is a set of practices. There are whole books on this. Here it is in short. Every object named in the data (often that’s the rows of the table) gets its own URL. Hierarchical relationships in the data are turned into nice URL paths with slashes. Put the URLs of related resources in output too (HATEOAS, h/t Ed Summers). Use HTTP GET and normal query string processing (a=x&b=y) for filtering, sorting, and paging. The idea of REST is that these are patterns already familiar to developers, and reusing existing patterns — rather than making up entirely new ones — makes the API more understandable and reusable. Also, use HTTPS for everything (h/t Eric Mill), and provide the API’s status as an API itself, possibly at the root URL of the API’s URL space (h/t Eric Mill again).
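The URL conventions above can be sketched with the standard library: hierarchy lives in the path, while filtering/sorting/paging live in the query string. The example URL is hypothetical:

```python
from urllib.parse import urlparse, parse_qs

def parse_api_request(url):
    """Split a RESTful URL into a resource path and flat query parameters."""
    parts = urlparse(url)
    resource = [seg for seg in parts.path.split("/") if seg]  # hierarchy as path
    params = {k: v[0] for k, v in parse_qs(parts.query).items()}
    return resource, params

resource, params = parse_api_request(
    "https://api.example.gov/agencies/epa/datasets?sort=title&limit=20")
print(resource)  # → ['agencies', 'epa', 'datasets']
print(params)    # → {'sort': 'title', 'limit': '20'}
```

Because both halves follow patterns developers already know (slash-separated hierarchy, `a=x&b=y` query strings), any HTTP client library can construct and consume these requests without custom tooling.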
….
Never Require Registration. Don’t have authentication on your API to keep people out! In fact, having a requirement of registration may contradict other guidelines (such as the 8 Principles of Open Government Data). If you do use an API key, make it optional. A non-authenticated tier lets developers quickly test the waters, and that is really important for getting developers in the door, and, again, it may be important for policy reasons as well. You can have a carrot to incentivize voluntary authentication: raise the rate limit for authenticated queries, for instance. (h/t Ben Balter)
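The carrot-not-stick tiering described above reduces to a few lines: anonymous requests always succeed, and a voluntary key only raises the ceiling. The limits and key store here are hypothetical:

```python
# Hypothetical tiered rate limits: anonymous use always works;
# registering a key just raises the ceiling.
ANON_LIMIT = 100       # requests/hour without a key
KEYED_LIMIT = 10000    # requests/hour with a voluntary key

def rate_limit_for(api_key, known_keys):
    """Never reject a request for lack of a key; reward registration
    with a higher rate limit."""
    if api_key and api_key in known_keys:
        return KEYED_LIMIT
    return ANON_LIMIT

known = {"abc123"}
print(rate_limit_for(None, known))      # → 100
print(rate_limit_for("abc123", known))  # → 10000
```

Note the unknown-key case falls back to the anonymous tier rather than returning an error, so a typo in a key degrades service instead of breaking it.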
Interactive Documentation. An API explorer is a web page that users can visit to learn how to build API queries and see results for test queries in real time. It’s an interactive browser tool, like interactive documentation. Relatedly, an “explain mode” in queries, which instead of returning results says what the query was and how it would be processed, can help developers understand how to use the API (h/t Eric Mill).
Developer Community. Life is hard. Coding is hard. The subject matter your data is about is probably very complex. Don’t make your API users wade into your API alone. Bring the users together, bring them to you, and sometimes go to them. Let them ask questions and report issues in a public place (such as GitHub). You may find that users will answer other users’ questions. Wouldn’t that be great? Have a mailing list for longer questions and discussion about the future of the API. Gather case studies of how people are using the API and show them off to the other users. It’s not a requirement that the API owner participate heavily in the developer community — just having a hub is very helpful — but of course the more participation the better.
Create Virtuous Cycles. Create an environment around the API that makes the data and API stronger. For instance, other individuals within your organization who need the data should go through the public API to the greatest extent possible. Those users are experts and will help you make a better API, once they realize they benefit from it too. Create a feedback loop around the data, meaning find a way for API users to submit reports of data errors and have a process to carry out data updates, if applicable and possible. Do this in public as much as possible so that others see they can also join the virtuous cycle.”
AskThem.io – Questions-and-Answers with Every Elected Official
Press Release: “AskThem.io, launching Feb. 10th, is a free & open-source website for questions-and-answers with public figures. AskThem is like a version of the White House’s “We The People” petition platform, where over 8 million people have taken action to support questions for a public response – but for the first time, for every elected official nationwide… AskThem.io has official government data for over 142,000 U.S. elected officials at every level of government: federal, state, county, and municipal. Also, AskThem allows anyone to ask a question to any verified Twitter account, for online dialogue with public figures.
Here’s how AskThem works for online public dialogue:
- For the first time in an open-source website, visitors enter their street address to see all their elected officials, from federal down to the city levels, or search for a verified Twitter account.
- Individuals & organizations submit a question to their elected officials – for example, asking a city council member about a proposed ban on plastic bags.
- People then sign on to the questions and petitions they support, voting them up on AskThem and sharing them over social media, as with online petitions.
- When a question passes a certain threshold of signatures, AskThem delivers it to the recipient over email & social media and encourages a public response – creating a continual, structured dialogue with elected officials at every level of government.
AskThem also incorporates open government data, such as city council agendas and key vote information, to inform good questions of people in power. Open government advocate, Chicago, IL Clerk Susana Mendoza, joined AskThem because she believes that “technology should bring residents and the Office of the Chicago City Clerk closer together.”
Elected officials who sign up with AskThem agree to respond to the most popular questions from their constituents (about two per month). Interested elected officials can sign up now to become verified, free & open to everyone.
Issue-based organizations can use question & petition info from AskThem to surface political issues in their area that people care about, stay continuously engaged with government, and promote public accountability. Participating groups on AskThem include the internet freedom non-profit Fight For the Future, the social media crowd-speaking platform Thunderclap.it, the Roosevelt Institute National Student Network, and more.”
Citizen Engagement: 3 Cities And Their Civic Tech Tools
Melissa Jun Rowley at the Toolbox: “Though democratic governments are of the people, by the people, and for the people, it often seems that our only input is electing officials who pass laws on our behalf. After all, I don’t know many people who attend town hall meetings these days. But the evolution of technology has given citizens a new way to participate. Governments are using technology to include as many voices from their communities as possible in civic decisions and activities. Here are three examples.
Raleigh, NC
Raleigh, North Carolina’s open government initiative is a great example of passive citizen engagement. By following an open source strategy, Open Raleigh has made city data available to the public. Citizens then use the data in a myriad of ways, from simply visualizing daily crime in their city to creating an app that lets users navigate and interactively utilize the city’s greenway system.
Fort Smith, AR
Using MindMixer, Fort Smith, Arkansas, has created an online forum for residents to discuss the city’s comprehensive plan, effectively putting the community’s future in the hands of the community itself. Citizens are invited to share their own ideas, vote on ideas submitted by others, and engage with city officials who are “listening” to the conversation on the site.
Seattle, WA
Being a tech town, it’s no surprise that Seattle is using social media as a citizen engagement tool. The Seattle Police Department (SPD) uses a variety of social media tools to reach the public. In 2012, the department launched a first-of-its-kind hyperlocal Twitter initiative. A police scanner for the Twitter generation, Tweets by Beat provides Twitter feeds of police dispatches in each of Seattle’s 51 police beats so that residents can find out what is happening right on their block.
In addition to Twitter and Facebook, SPD created a Tumblr to, in their own words, “show you your police department doing police-y things in your city.” In a nutshell, the department’s Tumblr serves as an extension of their other social media outlets.”
Civic Works Project translates data into community tools
The blog of the John S. and James L. Knight Foundation: “The Civic Works Project is a two-year effort to create apps and other tools to help increase the utility of local government data to benefit community organizations and the broader public.
This project looks systemically at public and private information that can be used to engage residents, solve community problems and increase government accountability. We believe that there is a new frontier where information can be used to improve public services and community building efforts that benefit local residents.
Through the Civic Works Project, we’re seeking to improve access to information and identify solutions to problems facing diverse communities. Uncovering the value of data—and the stories behind it—can enhance the provision of public services through the smart application of technology.
Here’s some of what we’ve accomplished.
Partnership with WBEZ Public Data Blog
The WBEZ Public Data Blog is dedicated to examining and promoting civic data in Chicago, Cook County and Illinois. WBEZ is partnering with the Smart Chicago Collaborative to provide news and analysis on open government by producing content items that explain and tell stories hidden in public data. The project seeks to increase the utility, understanding, awareness and availability of local civic data. It comprises blog postings on the hidden uses of data and stories from the data, while including diverse voices and discussions on how innovations can improve civic life. It also features interviews with community organizations, businesses, government leaders and residents on challenges that could be solved through more effective use of public data.
Crime and Punishment in Chicago
The Crime and Punishment in Chicago project will provide an index of data sources regarding the criminal justice system in Chicago. This site will aggregate sources of data, how this data is generated, how to get it and what data is unavailable.
Illinois OpenTech Challenge
The Illinois Open Technology Challenge aims to bring governments, developers and communities together to create digital tools that use public data to serve today’s civic needs and promote economic development. Smart Chicago and our partners worked with government officials to publish 138 new datasets (34 in Champaign, 15 in Rockford, 12 in Belleville, and 77 from the 42 municipalities in the South Suburban Mayors and Managers Association) on the State of Illinois data portal. Smart Chicago has worked with developers in meet-ups all over the state—in six locations in four cities with 149 people. The project has also allowed Smart Chicago to conduct outreach in each of our communities to reach regular residents with needs that can be addressed through data and technology.
LocalData + SWOP
The LocalData + SWOP project is part of our effort to help bridge technology gaps in high-capacity organizations. This effort helps the Southwest Organizing Project collect information about vacant and abandoned housing using the LocalData tool.
Affordable Care Act Outreach App
With the ongoing implementation of the Affordable Care Act, community organizations such as LISC-Chicago have been hard at work providing navigators to help residents register through the healthcare.gov site.
Currently, LISC-Chicago organizers are in neighborhoods contacting residents and encouraging them to go to their closest Center for Working Families. Using a combination of software, such as Wufoo and Twilio, Smart Chicago is helping LISC with its outreach by building a tool that enables organizers to send text reminders to sign up for health insurance to residents.
Texting Tools: Twilio and Textizen
Smart Chicago is expanding the Affordable Care Act outreach project to engage residents in other ways using SMS messaging.
Smart Chicago is also a local provider for Textizen, an SMS-based survey tool that civic organizations can use to obtain resident feedback. Organizations can create a survey campaign and then place the survey options on posters, postcards or screens during live events. They can then receive real-time feedback as people text in their answers.
WikiChicago
WikiChicago will be a hyper-local Wikipedia-like website that anyone can edit. For this project, Smart Chicago is partnering with the Chicago Public Library to feature local authors and books about Chicago, and to publish more information about Chicago’s rich history.”
Building Transparency Momentum
Aspen Baker in the Stanford Social Innovation Review: “Even engaged citizens in Oakland, Calif., didn’t know the city had a Public Ethics Commission, let alone what its purpose was, when I joined its ranks three years ago. And people who did know about it didn’t have many nice things to say: Local blogs sneered at its lack of power and few politicians feared its oversight. Created in 1996 as a watchdog organization responsible for opening up city government, the commission had become just another element of Oakland’s cumbersome, opaque bureaucracy.
It’s easy to see why. Technology and media have dramatically changed our expectations for what defines transparency and accountability. For example, in the past, walking into City Hall, making an official request for a public record, and receiving it in the mail within two weeks meant good, open government. Now, if an Internet search doesn’t instantly turn up an answer to your question about local government, the assumption often is: Government’s hiding something.
This is rarely the case. Consider that Oakland has more than 40,000 boxes full of paper documents housed in locations throughout the city, not to mention hundreds of thousands of email messages generated each year. Records management is a serious—and legal—issue, and it’s fallen way behind the times. In an age when local municipalities are financially stretched more than ever before (38 US cities have declared bankruptcy since 2010), the ability of cities to invest in the technology, systems, and staff—and to facilitate the culture change that cities often need—is a real, major challenge.
Yet, for the innovators, activists, and leaders within and outside city government, this difficult moment is also one of significant opportunity for change; and many are seizing it.
Last month, the Transparency Project of the Public Ethics Commission—a subcommittee that I initiated and have led as chair for the last year—released a report detailing just how far Oakland has come and how far we have to go to create a culture of innovation, accountability, and transparency throughout all levels of the city.
Collaboration Is Critical
What comes through the report loud and clear is the important role that collaboration between city staff, the community, nonprofits, and others played in shifting expectations and establishing new standards—including the momentum generated by the volunteer-led “City Camps,” a gathering of citizens, city government, and businesses to work on open government issues, and the recent launch of RecordTrac, an online public records request tracking system built by Code for America Fellows that departments throughout the city have successfully adopted. RecordTrac makes information available to everyone, not just the person who requested it.
Ideas and Experiments Matter
Innovators didn’t let financial circumstances get in the way of thinking about what even a cash-strapped, problem-plagued city like Oakland could do to meet the new expectations of its citizens to find information quickly and easily online. The commission’s executive director Whitney Barazoto, along with her collaborators, didn’t think “small and practical”; they chose “big and futuristic” instead. Most importantly, they sought to experiment with new ways of spreading ideas and engaging the public in discussions—far beyond the standard (and often ineffective) “three minutes at the mic” practice at public meetings….
The “Toward Collective Transparency” report details the history of the innovative efforts to increase transparency within the City of Oakland and offers a number of recommendations for what’s next. The most defining feature of this report is its acknowledgment of the significant cultural changes that are taking place within the city, and around the country, in the way we think about the role of government, citizens, and the type of engagement and collaboration that can—and should—exist between the two.
It’s easy to get caught up in what’s gone wrong, but our subcommittee made a choice early on not to get buried in the past. We capitalized on our commission’s strengths rather than our weaknesses, leaving “deficit thinking” behind so that we could think creatively about what the commission and city were uniquely positioned to do.
Why does all this matter?
Last year, John Bridgeland and Peter Orszag, former officials in the administrations of President Obama and President George W. Bush, wrote an article in The Atlantic titled, “Can Government Play Moneyball?” They pointed out the need to measure the impact of government spending using the evidence-based statistical approach that the Oakland A’s own general manager, Billy Beane, made famous. They argued that the scarcity Beane faced in building a competitive baseball team is not unlike the scarcity that the federal government is facing, and they hope it will help government break some of its own traditions. Governments at all levels—city, county, state and federal—are all facing revenue challenges, but we can’t let that stop progress and change.
It takes a lot more than data and technology to improve the way government operates and engages with its citizens; it demands vision and leadership. We need innovators who can break traditions and make the future come alive through collaboration, ideas, and experiments.”
Open data: Strategies for impact
Important though these considerations are, they miss what should be an obvious and more profound alternative.
Right now, organisations like DataKind™ and Periscopic, and many other entrepreneurs, innovators and established social enterprises that use open data, see things differently. They are using these straplines to shake up the status quo, to demonstrate that data-driven businesses can do well by doing good.
And it’s the confluence of the many national and international open data initiatives, and the growing number of technically able, socially responsible organisations, that provides the opportunity for social as well as economic growth. The World Wide Web Foundation now estimates that there are over 370 open data initiatives around the world. Collectively, and through portals such as Quandl and datacatalogs.org, these initiatives have made a staggering quantity of data available – in excess of eight million data sets. In addition, several successful and data-rich companies are entering into a new spirit of philanthropy – by donating their data for the public good. There’s no doubt that opening up data signals a new willingness by governments and businesses all over the world to engage with their citizens and customers in a new and more transparent way.
The challenge, though, is ensuring that these popular national and international open data initiatives are cohesive and impactful. And that the plans drawn up by public sector bodies to release specific data sets are based on the potential the data has to achieve a beneficial outcome, not – or, at least, not solely – based on the cost or ease of publication. Despite the best of intentions, only a relatively small proportion of open data sets now available has the latent potential to create significant economic or social impact. In our push to open up data and government, it seems that we may have fallen into the trap of believing the ends are the same as the means; that effect is the same as cause…”