Chicago: Increase and improve City data


Initiative 14 of the Chicago Tech Plan:  “The City will continue to increase and improve the quality of City data available internally and externally, and facilitate methods for analyzing that data to help create a smarter and more efficient city.”
Releasing data is a crucial component of creating an open and transparent government. Chicago is currently a leader in open data, capturing and publishing more than 400 machine-readable datasets to date. In 2012, Mayor Emanuel issued an executive order ensuring that the City continues to release new data, and empowering the Chief Data Officer to work with other City departments and agencies to develop new datasets. The City is following an aggressive schedule for releasing new datasets to the public and updating existing sets. It is also working to facilitate ways the City and others can use data to help improve City operations.
Chicago Shovels Plow Tracker
Source: https://web.archive.org/web/2000/https://www.chicago.gov/city/en/depts/mayor/iframe/plow_tracker.html
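Chicago's portal (data.cityofchicago.org) runs on Socrata, whose SODA API exposes each published dataset as a JSON endpoint. As a minimal sketch of what "machine-readable" means in practice, the helper below builds a SODA query URL; the dataset id `abcd-1234` and the `status` field are illustrative placeholders, not a real Chicago dataset.

```python
from urllib.parse import urlencode

def soda_url(domain, dataset_id, **params):
    """Build a SODA (Socrata Open Data API) query URL.

    Socrata portals such as data.cityofchicago.org expose each dataset
    as JSON at /resource/<dataset-id>.json; $-prefixed query parameters
    ($limit, $where, ...) filter and page the results.
    """
    query = urlencode({f"${k}": v for k, v in params.items()})
    return f"https://{domain}/resource/{dataset_id}.json?{query}"

# 'abcd-1234' is a placeholder id, not an actual Chicago dataset.
url = soda_url("data.cityofchicago.org", "abcd-1234",
               limit=10, where="status = 'Open'")
print(url)
```

Fetching that URL with any HTTP client would return the first ten matching rows as JSON, which is what makes the datasets directly consumable by apps like those described below.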
 


Open Data Success Story: ChicagoWorks
A collaboration between Alderman Ameya Pawar and local graphic design company 2pensmedia, ChicagoWorks is a free app that is changing the way Chicagoans interact with government. Using the app, residents can submit service requests directly to 311 and track the progress of reported issues. So far, more than 3,000 residents have downloaded the app.18


Open Data Success Story: SpotHero and Techstars Chicago
The app SpotHero makes residents’ lives easier by helping them find and reserve parking spots online. Developed in Chicago, the app had its start at Excelerate Labs, a Chicago start-up accelerator, now Techstars Chicago, that provides mentorship, training, and networking opportunities to 10 selected start-ups each year. After graduating from the program, ranked as one of the top 3 accelerators nationally, SpotHero attracted $2.5 million in VC funding. With this funding, the company is hiring new staff and working to expand to other cities.19


Open Data Success Story: OpenGov Hack Night
Chicago boasts a community of “civic hackers” who are passionate about using technology to improve the city. An example of this passion in action is the OpenGov Hack Night. Organized by Open City, an organization that builds web apps and other tools using open government data, the Hack Night attracts civic hackers and curious residents eager to explore the intersection of open government data, smart cities, and technology. Every week, the Hack Night provides a collaborative environment for residents to learn about open data, work on cutting-edge projects, and network with passionate civic technologists.20

Bright Spots of open government to be recognised at global summit


Press Release of the UK Cabinet Office: “The 7 shortlisted initiatives vying for the Bright Spots award show how governments in Open Government Partnership countries are working with citizens to sharpen governance, harness new technologies to increase public participation and improve government responsiveness.
At the Open Government Partnership summit in London on 31 October 2013 and 1 November 2013, participants will be able to vote for one of the shortlisted projects. The winning project – the Bright Spot – will be announced in the summit’s final plenary session….
The shortlisted entries for the Bright Spots prize – which will be awarded at the London summit – are:

  • Chile – ChileAtiende

The aim of ChileAtiende has been to simplify government to citizens by providing a one-stop shop for accessing public services. Today, ChileAtiende has more than 190 offices across the whole country, a national call centre and a digital platform, through which citizens can access multiple services and benefits without having to navigate multiple government offices.

  • Estonia – People’s Assembly

The People’s Assembly is a deliberative democracy tool, designed to encourage input from citizens on the government’s legislative agenda. This web-based platform allows ordinary citizens to propose policy solutions to problems such as corruption. Within 3 weeks, 1,800 registered users posted nearly 6,000 ideas and comments. Parliament has since set a timetable for the most popular proposals to be introduced in the formal proceedings.

  • Georgia – improvements to the Freedom of Information Act

Civil society organisations in Georgia have successfully used the government’s participation in OGP to advocate improvements to the country’s Freedom of Information legislation. Government agencies are now obliged to proactively publish information in a way that is accessible to anyone, and to establish an electronic request system for information.

  • Indonesia – complaints portal

LAPOR! (meaning “to report” in Indonesian) is a social media channel where Indonesian citizens can submit complaints and enquiries about development programmes and public services. Comments are transferred directly to relevant ministries or government agencies, which can respond via the website. LAPOR! now has more than 225,350 registered users and receives an average of 1,435 inputs per day.

  • Montenegro – Be Responsible app

“Be Responsible” is a mobile app that allows citizens to report local problems – from illegal waste dumps, misuse of official vehicles and irregular parking, to failure to comply with tax regulations and issues over access to healthcare and education.

  • Philippines – citizen audits

The Citizen Participatory Audit (CPA) project is exploring ways in which citizens can be directly engaged in the audit process for government projects and contribute to ensuring greater efficiency and effectiveness in the use of public resources. 4 pilot audits are in progress, covering public works, welfare, environment and education projects.

  • Romania – transparency in public sector recruitment

The PublicJob.ro website was set up to counter corruption and lack of transparency in civil service recruitment. PublicJob.ro takes recruitment data from public organisations and e-mails it to more than 20,000 subscribers in a weekly newsletter. As a result, it has become more difficult to manipulate the recruitment process.”

Talking About a (Data) Revolution


Dave Banisar at Article 19: “It is important to recognize the utility that data can bring. Data can ease analysis, reveal important patterns and facilitate comparisons. For example, the Transactional Access Clearing House (TRAC – http://www.trac.org) at Syracuse University uses data sets from the US Department of Justice to analyze how the federal government enforces its criminal and civil laws, showing how laws are applied differently across the US.
The (somewhat ICT-companies manufactured) excitement over “E-government” in the late 1990s imagined a brave new e-world where governments would quickly and easily provide needed information and services to their citizens. This was presented as an alternative to the “reactive” and “confrontational” right to information laws but eventually led to the realization that ministerial web pages and the ability to pay tickets online did not lead to open government. Singapore ranks near the top every year on e-government but is clearly not an ‘open government’. Similarly, it is important to recognize that governments providing data through voluntary measures is not enough.
For open data to promote open government, it needs to operate within a framework of law and regulation that ensures that information is collected, organized and stored and then made public in a timely, accurate and useful form.   The information must be more than just what government bodies find useful to release, but what is important for the public to know to ensure that those bodies are accountable.
Otherwise, it is in danger of just being propaganda, subject to manipulation to make government bodies look good. TRAC has had to sue the US federal government dozens of times under the Freedom of Information Act to obtain the government data, and after they publish it, some government bodies still claim that the information is incorrect. Voluntary systems of publication usually fail when they potentially embarrass the bodies doing the publication.
In the countries where open data has been most successful such as the USA and UK, there also exists a legal right to demand information which keeps bodies honest. Most open government laws around the world now have requirements for affirmative publication of key information and they are slowly being amended to include open data requirements to ensure that the information is more easily usable.
Where open government laws are weak or absent, many barriers can obstruct open data. In Kenya, which has championed its open data portal while being slow to adopt a law on freedom of information, a recent review found that the portal was stagnating. In part, the problem was that in the absence of laws mandating openness, there remains a culture of secrecy and fear of releasing information.
Further, mere access to data is not enough to ensure informed participation by citizens and enable their ability to affect decision-making processes. Legal rights to all information held by governments – right to information laws – are essential to tell the “why”. RTI reveals how and why decisions and policy are made – secret meetings, questionable contracts, dubious emails and other information. These are essential elements for oversight and accountability. Being able to document that a road was built for political reasons is as crucial for change as recognizing that it’s in the wrong place. The TRAC users, mostly journalists, use the system as a starting point to ask why enforcement is so uneven or why taxes are not being collected. They need sources and open government laws to ask these questions.
Of course, even open government laws are not enough. There needs to be strong rights for citizen consultation and participation and the ability to enforce those rights, such as is mandated by the UNECE Convention on Access to Environment Information, Public Participation and Access to Justice (Aarhus Convention). A protocol to that convention has led to a Europe-wide data portal on environmental pollution.
For open data to be truly effective, there needs to be a right to information enshrined in law that requires that information is made available in a timely, reliable format that people want, not just what the government body wants to release. And it needs to be backed up with rights of engagement and participation. From this open data can flourish.  The OGP needs to refocus on the building blocks of open government – good law and policy – and not just the flashy apps.”

And Data for All: On the Validity and Usefulness of Open Government Data


Paper presented at the 13th International Conference on Knowledge Management and Knowledge Technologies: “Open Government Data (OGD) stands for a relatively young trend to make data that is collected and maintained by state authorities available to the public. Although various Austrian OGD initiatives have been started in the last few years, little is known about the validity and the usefulness of the data offered. Based on the data-set on Vienna’s stock of trees, we address two questions in this paper. First of all, we examine the quality of the data by validating it according to knowledge from a related discipline. It shows that the data-set we used correlates with findings from meteorology. Then, we explore the usefulness and exploitability of OGD by describing a concrete scenario in which this data-set can be supportive for citizens in their everyday life and by discussing further application areas in which OGD can be beneficial for different stakeholders and even commercially used.”
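The kind of validity check the paper describes can be sketched as simple plausibility rules run over the records. The sketch below is illustrative only: the field names (`species`, `height_m`) and the sample records are assumptions, not the actual schema of Vienna's tree dataset.

```python
def validate_trees(records, max_height_m=60.0):
    """Flag records that fail basic plausibility checks.

    Returns a list of (record_index, problem) pairs for records with a
    missing species or an out-of-range height. The field names and the
    60 m ceiling are illustrative assumptions, not the real schema.
    """
    problems = []
    for i, r in enumerate(records):
        if r.get("species") in (None, ""):
            problems.append((i, "missing species"))
        h = r.get("height_m")
        if h is None or not (0 < h <= max_height_m):
            problems.append((i, "implausible height"))
    return problems

sample = [
    {"species": "Tilia cordata", "height_m": 14.0},   # plausible
    {"species": "", "height_m": 9.5},                 # species missing
    {"species": "Acer platanoides", "height_m": -3.0} # bad height
]
print(validate_trees(sample))
# → [(1, 'missing species'), (2, 'implausible height')]
```

Checks like these only catch internal inconsistencies; the paper's stronger test is external, comparing the dataset against findings from an independent discipline (meteorology).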

Five Ways to Make Government Procurement Better


Mark Headd at Civic Innovations:  “Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of healthcare.gov.
Myriad blog posts, stories, and articles have been written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.
Though the details of this high profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken…
With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.
Raise the threshold on simplified / streamlined procurement
Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.
Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.
Identify clear standards for projects
Having a clear set of vendor-agnostic IT standards to use when developing RFPs and in performing work can make a huge difference in how a project turns out. Clearly articulating standards for:

  • The various components that a system will use.
  • The environment in which it will be housed.
  • The testing it must undergo prior to final acceptance.

…can go a long way to reduce the risk and uncertainty inherent in IT projects.
It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.
Require open source
Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.
In addition, government RFPs should encourage the use of existing open source tools – leveraging existing software components that are in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their projects open source, they enable anyone who understands software development to help make them better.
Develop a more robust internal capacity for IT project management and implementation
Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.
Part of the reason that governments make use of a variety of different risk mitigation provisions in public bidding is that there is a lack of people in government with hands-on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the perceived risk that governments take on with new technology projects and the lack of experienced technologists working in government.
Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.
Make contracting, lobbying and campaign contribution data public as open data
One of the more disheartening revelations to come out of the analysis of the healthcare.gov implementation is that some of the firms that were awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local level as well.
This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.
In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency around the bidding process by working to ensure that all contracting data as well as data listing publicly registered lobbyists and contributions to political campaigns is open.
Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms to participate as possible – including small firms more adept at agile software development methodologies. More bids typically equates to higher quality proposals and lower prices.
None of the changes listed above will be easy, and governments are positioned differently in how well they may achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are items that I personally think are important and very achievable.
One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and different states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.”

Collaborative Internet Governance: Terms and Conditions of Analysis


New paper by Mathieu O’Neil in the special issue on Contested Internet Governance of the Revue française d’études américaines: “Online projects are communities of practice which attempt to bypass the hierarchies of everyday life and to create autonomous institutions and forms of organisation. A wealth of theoretical frameworks has been put forward to account for these networked actors’ capacity to communicate and self-organise. This article reviews terminology used in Internet research and assesses what it implies for the understanding of regulatory-oriented collective action. In terms of the environment in which interpersonal communication occurs, what differences does it make to speak of “public spheres” or of “public spaces”? In terms of social formations, of “organisations” or “networks”? And in terms of the diffusion of information over the global network, of “contagion” or “trajectories”? Selecting theoretical frames is a momentous decision for researchers, as it authorises or forbids the analysis of different types of behaviour and practices.”
Other papers on Internet Governance in the Revue:
  • Divina Frau-Meigs (Ed.), Conducting Research on the Internet and its Governance
  • The Internet and its Governance: A General Bibliography
  • Glossary of Key Terms and Notions about Internet Governance
  • Julia Pohle and Luciano Morganti, The Internet Corporation for Assigned Names and Numbers (ICANN): Origins, Stakes and Tensions
  • Francesca Musiani et al., Net Neutrality as an Internet Governance Issue: The Globalization of an American-Born Debate
  • Jeanette Hofmann, Narratives of Copyright Enforcement: The Upward Ratchet and the Sleeping Giant
  • Elizabeth Dubois and William H. Dutton, The Fifth Estate in Internet Governance: Collective Accountability of a Canadian Policy Initiative
  • Mathieu O’Neil, Collaborative Internet Governance: Terms and Conditions of Analysis
  • Peng Hwa Ang and Natalie Pang, Globalization of the Internet, Sovereignty or Democracy: The Trilemma of the Internet Governance Forum

Special issue of FirstMonday: "Making data — Big data and beyond"


Introduction by Rasmus Helles and Klaus Bruhn Jensen: “Data are widely understood as minimal units of information about the world, waiting to be found and collected by scholars and other analysts. With the recent prominence of ‘big data’ (Mayer–Schönberger and Cukier, 2013), the assumption that data are simply available and plentiful has become more pronounced in research as well as public debate. Challenging and reflecting on this assumption, the present special issue considers how data are made. The contributors take big data and other characteristic features of the digital media environment as an opportunity to revisit classic issues concerning data — big and small, fast and slow, experimental and naturalistic, quantitative and qualitative, found and made.
Data are made in a process involving multiple social agents — communicators, service providers, communication researchers, commercial stakeholders, government authorities, international regulators, and more. Data are made for a variety of scholarly and applied purposes, oriented by knowledge interests (Habermas, 1971). And data are processed and employed in a whole range of everyday and institutional contexts with political, economic, and cultural implications. Unfortunately, the process of generating the materials that come to function as data often remains opaque and certainly under–documented in the published research.
The following eight articles seek to open up some of the black boxes from which data can be seen to emerge. While diverse in their theoretical and topical focus, the articles generally approach the making of data as a process that is extended in time and across spatial and institutional settings. In the common culinary metaphor, data are repeatedly processed, rather than raw. Another shared point of attention is meta–data — the type of data that bear witness to when, where, and how other data such as Web searches, e–mail messages, and phone conversations are exchanged, and which have taken on new, strategic importance in digital media. Last but not least, several of the articles underline the extent to which the making of data as well as meta–data is conditioned — facilitated and constrained — by technological and institutional structures that are inherent in the very domain of analysis. Researchers increasingly depend on the practices and procedures of commercial entities such as Google and Facebook for their research materials, as illustrated by the pivotal role of application programming interfaces (API). Research on the Internet and other digital media also requires specialized tools of data management and analysis, calling, once again, for interdisciplinary competences and dialogues about ‘what the data show.’”
See Table of Contents

The Best American Infographics 2013


New book by Gareth Cook: “The rise of infographics across virtually all print and electronic media—from a striking breakdown of classic cocktails to a graphic tracking 200 influential moments that changed the world to visually arresting depictions of Twitter traffic—reveals patterns in our lives and our world in fresh and surprising ways. In the era of big data, where information moves faster than ever, infographics provide us with quick, often influential bursts of art and knowledge—on the environment, politics, social issues, health, sports, arts and culture, and more—to digest, to tweet, to share, to go viral.
The Best American Infographics captures the finest examples from the past year, including the ten best interactive infographics, of this mesmerizing new way of seeing and understanding our world.”
See also a selection in Wired.
 

The transition towards transparency


Roland Harwood at the Open Data Institute Blog: “It’s a very exciting time for the field of open data, especially in the UK public sector, which is arguably leading the world in this emerging discipline right now, in no small part thanks to the efforts of the Open Data Institute. There is a strong push to release public data and to explore new innovations that can be created as a result.
For instance, the Ordnance Survey have been leading the way with opening up half of their data for others to use, complemented by their GeoVation programme which provides support and incentive for external innovators to develop new products and services.
More recently the Technology Strategy Board have been working with the likes of NERC, Met Office, Environment Agency and other public agencies to help solve business problems using environmental data.
It goes without saying that data won’t leap up and create any value by itself, any more than a pile of discarded parts outside a factory will assemble itself into a car. We’ve found that the secret of successful open data innovation is to work with people who are trying to solve a specific problem. Simply releasing the data is not enough. See below a summary of our Do’s and Don’ts of opening up data
Do…

  • Make sure data quality is high (ODI Certificates can help!)
  • Promote innovation using data sets. Transparency is only a means to an end
  • Enhance communication with external innovators
  • Make sure your co-creators are incentivised
  • Get organised, create a community around an issue
  • Pass on learnings to other similar organisations
  • Experiment – open data requires new mindsets and business models
  • Create safe spaces – Innovation Airlocks – to share and prototype with trusted partners
  • Be brave – people may do things with the data that you don’t like
  • Set out to create commercial or social value with data

Don’t…

  • Just release data and expect people to understand or create with it. Publication is not the same as communication
  • Wait for data requests, put the data out first informally
  • Avoid challenges to current income streams
  • Go straight for the finished article, use rapid prototyping
  • Be put off by the tensions between confidentiality, data protection and publishing
  • Wait for the big budget or formal process but start big things with small amounts now
  • Be technology led, be business led instead
  • Expect the community to entirely self-manage
  • Restrict open data to the IT literate – create interdisciplinary partnerships
  • Get caught in the false dichotomy that is commercial vs. social

In summary we believe we need to assume openness as the default (for organisations that is, not individuals) and secrecy as the exception – the exact opposite to how most commercial organisations currently operate. …”

Using Participatory Crowdsourcing in South Africa to Create a Safer Living Environment


New Paper by Bhaveer Bhana, Stephen Flowerday, and Aharon Satt in the International Journal of Distributed Sensor Networks: “The increase in urbanisation is making the management of city resources a difficult task. Data collected through observations (utilising humans as sensors) of the city surroundings can be used to improve decision making in terms of managing these resources. However, the data collected must be of a certain quality in order to ensure that effective and efficient decisions are made. This study is focused on the improvement of emergency and non-emergency services (city resources) through the use of participatory crowdsourcing (humans as sensors) as a data collection method (collect public safety data), utilising voice technology in the form of an interactive voice response (IVR) system.
The study illustrates how participatory crowdsourcing (specifically humans as sensors) can be used as a Smart City initiative focusing on public safety by illustrating what is required to contribute to the Smart City, and developing a roadmap in the form of a model to assist decision making when selecting an optimal crowdsourcing initiative. Public safety data quality criteria were developed to assess and identify the problems affecting data quality.
This study is guided by design science methodology and applies three driving theories: the Data Information Knowledge Action Result (DIKAR) model, the characteristics of a Smart City, and a credible Data Quality Framework. Four critical success factors were developed to ensure high quality public safety data is collected through participatory crowdsourcing utilising voice technologies.”