Bright Spots of open government to be recognised at global summit
Press Release of the UK Cabinet Office: “The 7 shortlisted initiatives vying for the Bright Spots award show how governments in Open Government Partnership countries are working with citizens to sharpen governance, harness new technologies to increase public participation and improve government responsiveness.
At the Open Government Partnership summit in London on 31 October 2013 and 1 November 2013, participants will be able to vote for one of the shortlisted projects. The winning project – the Bright Spot – will be announced in the summit’s final plenary session….
The shortlisted entries for the Bright Spots prize – which will be awarded at the London summit – are:
- Chile – ChileAtiende
The aim of ChileAtiende has been to simplify government to citizens by providing a one-stop shop for accessing public services. Today, ChileAtiende has more than 190 offices across the whole country, a national call centre and a digital platform, through which citizens can access multiple services and benefits without having to navigate multiple government offices.
- Estonia – People’s Assembly
The People’s Assembly is a deliberative democracy tool, designed to encourage input from citizens on the government’s legislative agenda. This web-based platform allows ordinary citizens to propose policy solutions to problems including fighting corruption. Within 3 weeks, 1,800 registered users posted nearly 6,000 ideas and comments. Parliament has since set a timetable for the most popular proposals to be introduced in the formal proceedings.
- Georgia – improvements to the Freedom of Information Act
Civil society organisations in Georgia have successfully used the government’s participation in OGP to advocate improvements to the country’s Freedom of Information legislation. Government agencies are now obliged to proactively publish information in a way that is accessible to anyone, and to establish an electronic request system for information.
- Indonesia – complaints portal
LAPOR! (meaning “to report” in Indonesian) is a social media channel where Indonesian citizens can submit complaints and enquiries about development programmes and public services. Comments are transferred directly to relevant ministries or government agencies, which can respond via the website. LAPOR! now has more than 225,350 registered users and receives an average of 1,435 inputs per day.
- Montenegro – Be Responsible app
“Be Responsible” is a mobile app that allows citizens to report local problems – from illegal waste dumps, misuse of official vehicles and irregular parking, to failure to comply with tax regulations and issues over access to healthcare and education.
- Philippines – citizen audits
The Citizen Participatory Audit (CPA) project is exploring ways in which citizens can be directly engaged in the audit process for government projects and contribute to ensuring greater efficiency and effectiveness in the use of public resources. Four pilot audits are in progress, covering public works, welfare, environment and education projects.
- Romania – transparency in public sector recruitment
The PublicJob.ro website was set up to counter corruption and lack of transparency in civil service recruitment. PublicJob.ro takes recruitment data from public organisations and e-mails it to more than 20,000 subscribers in a weekly newsletter. As a result, it has become more difficult to manipulate the recruitment process.”
Building a Smarter City
PSFK: “As cities around the world grow in size, one of the major challenges will be how to make city services and infrastructure more adaptive and responsive in order to keep existing systems running efficiently, while expanding to accommodate greater need. In our Future Of Cities report, PSFK Labs investigated the key trends and pressing issues that will play a role in shaping the evolution of urban environments over the next decade.
A major theme identified in the report is Sensible Cities, which is bringing intelligence to the city and its citizens through the free flow of information and data, helping improve both immediate and long term decision making. This theme consists of six key trends: Citizen Sensor Networks, Hyperlocal Reporting, Just-In-Time Alerts, Proximity Services, Data Transparency, and Intelligent Transport.
The Citizen Sensor Networks trend described in the Future Of Cities report highlights how sensor-laden personal electronics are enabling everyday people to passively collect environmental data and other information about their communities. When fed back into centralized, public databases for analysis, this accessible pool of knowledge enables any interested party to make more informed choices about their surroundings. These feedback systems require little infrastructure, and transform people into sensor nodes with little effort on their part. An example of this type of network in action is Street Bump, a crowdsourcing project that helps residents improve their neighborhood streets by collecting data on real-time road conditions while they drive. Using the mobile application’s motion-detecting accelerometer, Street Bump is able to sense when a bump is hit, while the phone’s GPS records and transmits the location.
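The Street Bump idea described above can be sketched in a few lines: watch the vertical accelerometer reading for sharp deviations from gravity, and tag each spike with the most recent GPS fix. The threshold value and data layout below are illustrative assumptions, not Street Bump’s actual implementation.

```python
# Hypothetical sketch of accelerometer-based pothole detection.
# A pothole shows up as a sharp deviation from the ~9.8 m/s^2 gravity
# baseline; the threshold here is an assumed tuning value.
BUMP_THRESHOLD = 3.0  # m/s^2

def detect_bumps(samples):
    """samples: list of (vertical_accel_m_s2, (lat, lon)) readings.
    Returns the GPS locations of candidate bumps."""
    bumps = []
    for z, location in samples:
        if abs(z - 9.8) > BUMP_THRESHOLD:
            bumps.append(location)
    return bumps

readings = [
    (9.7, (42.36, -71.06)),
    (14.2, (42.37, -71.05)),  # sharp jolt -> candidate pothole
    (9.9, (42.38, -71.04)),
]
print(detect_bumps(readings))  # [(42.37, -71.05)]
```

In practice an app like this would also filter out non-road events (dropped phones, speed humps) by requiring multiple drivers to report a spike at the same location before flagging it.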
The next trend of Hyperlocal Reporting describes how crowdsourced information platforms are changing the top-down nature of how news is gathered and disseminated by placing reporting tools in the hands of citizens, allowing any individual to instantly broadcast about what is important to them. Often using mobile phone technology, these information monitoring systems not only provide real-time, location specific data, but also boost civic engagement by establishing direct channels of communication between an individual and their community. A good example of this is Retio, which is a mobile application that allows Mexican citizens to report on organized crime and corruption using social media. Each issue is plotted on a map, allowing users and authorities to get an overall idea of what has been reported or narrow results down to specific incidents.
…
Data Transparency is a trend that examines how city administrators, institutions, and companies are publicly sharing data generated within their systems to add new levels of openness and accountability. Availability of this information not only strengthens civic engagement, but also establishes a collaborative agenda at all levels of government that empowers citizens through greater access and agency. For example, OpenSpending is a mobile and web-based application that allows citizens in participating cities to examine where their taxes are being spent through interactive visualizations. Citizens can review their personal share of public works, examine local impacts of public spending, rate and vote on proposed plans for spending and monitor the progress of projects that are or are not underway…”
GitHub and Government
New site: “Make government better, together. Stories of open source, open data, and open government.
This site is an open source effort to showcase best practices of open sourcing government. See something that you think could be better? Want to submit your own story? Simply fork the project and submit a pull request.
…
Ready to get started on GitHub? Here are some easy ways to get your feet wet.
Feedback Repository
GitHub’s about connecting with developers. Whether you’re an API publishing pro, or just getting started, creating a “feedback” repository can go a long way to connect your organization with the community. Get feedback from current and potential data consumers by creating a specific repository for them to contribute ideas and suggestions for types of data or other information they’d like to see opened. Here’s how:
- Create a new repository
- Choose your organization as the Owner
- Name the repository “feedback” or similar
- Click the checkbox to automatically create a README.md file
- Set up your Readme
- Click README.md within your newly created repository
- Click Edit
- Introduce yourself, describe why you’ve joined GitHub, what you’re hoping to do and what you’d like to learn from the development community. Encourage them to leave feedback through issues on the repository.
- Click Commit Changes

Sample text for your README.md:
# City of Gotham Feedback
We've just joined GitHub and want to know what data would be interesting to our development community?
Leave us comments via issues!
Open source a Dataset
Open sourcing a dataset can be as simple as uploading a .csv to GitHub and letting people know about it. Rather than publishing data as a zip file on your website or an FTP server, you can add the files through the GitHub.com web interface, or via the GitHub for Windows or GitHub for Mac native clients. Create a new repository to store your datasets – in many cases, it’s as easy as drag, drop, sync.
GitHub can host any file type (although open, non-binary formats like .csv tend to work best). Plus, GitHub supports rendering certain open data formats interactively, such as the popular geospatial .geojson format. Once uploaded, citizens can view the files, and can even open issues or submit pull requests with proposed fixes.
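Before publishing a .geojson file, it is worth sanity-checking its structure, since GitHub’s interactive map rendering expects well-formed GeoJSON. A minimal sketch using only the standard library, checking the “type” and “features” keys that the GeoJSON format requires of a FeatureCollection:

```python
# Minimal sanity check for a GeoJSON FeatureCollection before committing it.
import json

def check_geojson(text):
    """Return True if text parses as a FeatureCollection whose
    members all declare themselves as Features."""
    data = json.loads(text)
    if data.get("type") != "FeatureCollection":
        return False
    return all(f.get("type") == "Feature" for f in data.get("features", []))

sample = '''{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-71.06, 42.36]},
     "properties": {"name": "City Hall"}}
  ]
}'''
print(check_geojson(sample))  # True
```

A real pipeline would go further (validating geometry types and coordinate ranges), but even a check this small catches the malformed files that break downstream consumers.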
Explore Open Source Civic Apps
There are many open source applications freely available on GitHub that were built just for government. Check them out, and see if one fits a need. Here are some examples:
- Adopt-a – This open source web app was created for the City of Boston in 2011 by Code for America fellows. It allows residents to “adopt” a hydrant and make sure it’s clear of snow in the winter so that emergency crews can locate them when needed. It has since been adopted in Chicago (for sidewalks), Seattle (for storm drains), and Honolulu (for tsunami sirens).
- StreetMix – Another creation of Code for America fellows (2013) this website, www.streetmix.net, allows anyone to create street sections in a way that is not only beautiful but educational, too. No downloading, no installing, no paying – make and save your creations right at the website. Great for internal or public community planning meetings.
- We The People – We The People, the White House’s petitions application hosted at petitions.whitehouse.gov, is a Drupal module that allows citizens to submit and digitally sign petitions.
Open source something small
Chances are you’ve got something small you can open source. Check in with your web or new media team, and see if they’ve got something they’ve been dying to share or blog about, no matter how small. It can be a snippet of analytics code, or maybe a small script used internally. It doesn’t even have to be code.
Post your website’s privacy policy, comment moderation policy, or terms of service and let the community weigh in before your next edit. No matter how small it is, getting your first open source project going is a great first step.
Improve an existing project
Does your agency use an existing open source project to conduct its own business? Open an issue on the project’s repository with a feature request or a bug you spot. Better yet, fork the project, and submit your improvements. Even if it’s one or two lines of code, such examples are great to blog about to showcase your efforts.
Don’t forget, this site is an open source project, too. Making a needed edit is another great way to get started.”
Free Software Ties the Internet of Things Together
Rachel Metz in MIT Technology Review: “OpenRemote is an open-source Internet of Things platform that could help spur smarter homes and cities.
If you buy several Internet-connected home gadgets—say, a “smart” thermostat, “smart” door lock, and “smart” window blinds—you’ll likely have to control each one with a separate app, meaning it exists in its own little silo.
That’s not how Elier Ramirez does it. In his home, an iPad app controls his lights, ceiling fans, and TV and stereo. Pressing a single button within the app can shut off all his lights and gadgets when he leaves.
Ramirez can tap a lamp in an image to turn an actual lamp off and on in his apartment, and at the same time he’ll see the picture on the tablet’s screen go dark or become illuminated. Ramirez also set up a presence-sensing feature that uses his cell phone to determine if he’s home (it checks whether or not he has connected to his home Wi-Fi network). This can automatically turn on the lights if he’s there. Ramirez runs the whole setup from a small computer in his home.
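The presence-sensing logic described above reduces to a simple rule: treat “connected to the home Wi-Fi” as “home,” and switch the lights accordingly. In the sketch below, the network name and the light-switching callback are hypothetical stand-ins for the platform-specific Wi-Fi query and the home-automation command that would do the real work.

```python
# Illustrative presence-sensing rule: toggle lights when the phone's
# Wi-Fi network changes. HOME_SSID and set_lights are assumptions,
# not part of any real OpenRemote API.
HOME_SSID = "RamirezHome"  # assumed home network name

def presence_update(ssid, lights_on, set_lights):
    """Given the currently connected SSID and the current light state,
    call set_lights() on arrival/departure and return the new state."""
    home = (ssid == HOME_SSID)
    if home and not lights_on:
        set_lights(True)   # arrived home: lights on
        return True
    if not home and lights_on:
        set_lights(False)  # left home: lights off
        return False
    return lights_on       # no change

commands = []
state = presence_update("RamirezHome", False, commands.append)
print(state, commands)  # True [True]
```

A real setup would poll this check periodically (or hook into the router’s association events) and debounce brief Wi-Fi dropouts so the lights don’t flicker every time the phone roams.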
The software behind all this interconnection comes from a company called OpenRemote, which is plugging away on an open-source software platform for linking Internet-connected gadgets, making it easier to control all kinds of smart home devices, regardless of who made them. And it makes it easy to automate actions like lowering your connected window blinds if the temperature sensed in your living room goes above 75 degrees….
OpenRemote also sees a moneymaking opportunity beyond the home in providing its software to cities, which are becoming increasingly interested in using technology for everything from communicating with citizens to monitoring traffic. Last year, OpenRemote conducted a small test in Eindhoven, in hopes of using automation and crowdsourcing to monitor a city. This included people-tracking with cameras, sound-level tracking, social-media monitoring, and an app that people in the area could use to rate what the atmosphere was like. The company is currently working on a larger-scale project in Eindhoven, Kil says. “If you put four walls around a city, it’s a big room, if you know what I mean,” he says.”
Connecting Grassroots and Government for Disaster Response
New Report by John Crowley for the Wilson Center: “Leaders in disaster response are finding it necessary to adapt to a new reality. Although community actions have always been the core of the recovery process, collective action from the grassroots has changed response operations in ways that few would have predicted. Using new tools that interconnect over expanding mobile networks, citizens can exchange information via maps and social media, then mobilize thousands of people to collect, analyze, and act on that information. Sometimes, community-sourced intelligence may be fresher and more accurate than the information given to the responders who provide aid…
Also see the companion report from our September 2012 workshop, written by Ryan Burns and Lea Shanley, as well as a series of videos from the workshop and podcasts with workshop participants.”
Special issue of FirstMonday: "Making data — Big data and beyond"
Introduction by Rasmus Helles and Klaus Bruhn Jensen: “Data are widely understood as minimal units of information about the world, waiting to be found and collected by scholars and other analysts. With the recent prominence of ‘big data’ (Mayer–Schönberger and Cukier, 2013), the assumption that data are simply available and plentiful has become more pronounced in research as well as public debate. Challenging and reflecting on this assumption, the present special issue considers how data are made. The contributors take big data and other characteristic features of the digital media environment as an opportunity to revisit classic issues concerning data — big and small, fast and slow, experimental and naturalistic, quantitative and qualitative, found and made.
Data are made in a process involving multiple social agents — communicators, service providers, communication researchers, commercial stakeholders, government authorities, international regulators, and more. Data are made for a variety of scholarly and applied purposes, oriented by knowledge interests (Habermas, 1971). And data are processed and employed in a whole range of everyday and institutional contexts with political, economic, and cultural implications. Unfortunately, the process of generating the materials that come to function as data often remains opaque and certainly under–documented in the published research.
The following eight articles seek to open up some of the black boxes from which data can be seen to emerge. While diverse in their theoretical and topical focus, the articles generally approach the making of data as a process that is extended in time and across spatial and institutional settings. In the common culinary metaphor, data are repeatedly processed, rather than raw. Another shared point of attention is meta–data — the type of data that bear witness to when, where, and how other data such as Web searches, e–mail messages, and phone conversations are exchanged, and which have taken on new, strategic importance in digital media. Last but not least, several of the articles underline the extent to which the making of data as well as meta–data is conditioned — facilitated and constrained — by technological and institutional structures that are inherent in the very domain of analysis. Researchers increasingly depend on the practices and procedures of commercial entities such as Google and Facebook for their research materials, as illustrated by the pivotal role of application programming interfaces (API). Research on the Internet and other digital media also requires specialized tools of data management and analysis, calling, once again, for interdisciplinary competences and dialogues about ‘what the data show.’”
See Table of Contents
The move toward 'crowdsourcing' public safety
What is “crowdsourcing public safety” and why are public safety agencies moving toward this trend?
Crowdsourcing—the term coined by our own assistant professor of journalism Jeff Howe—involves taking a task or job traditionally performed by a distinct agent, or employee, and having that activity be executed by an “undefined, generally large group of people in an open call.” Crowdsourcing public safety involves engaging and enabling private citizens to assist public safety professionals in addressing natural disasters, terror attacks, organized crime incidents, and large-scale industrial accidents.
Public safety agencies have long recognized the need for citizen involvement. Tip lines and missing persons bulletins have been used to engage citizens for years, but with advances in mobile applications and big data analytics, the ability of public safety agencies to receive, process, and make use of high volumes of tips and leads makes crowdsourced searches and investigations more feasible. You saw this in the FBI’s web-based tip line for the Boston Marathon bombing. You see it in the “See Something Say Something” initiatives throughout the country. You see it in AMBER alerts or even remote search and rescue efforts. You even see it in more routine instances like Washington State’s HERO program to reduce traffic violations.
Have these efforts been successful, and what challenges remain?
There are a number of issues to overcome with regard to crowdsourcing public safety – such as maintaining privacy rights, ensuring data quality, and improving trust between citizens and law enforcement officers. Controversies over the National Security Agency’s surveillance program and neighborhood watch programs – particularly the shooting death of teenager Trayvon Martin by neighborhood watch captain George Zimmerman – reflect some of these challenges. Research has not yet established a precise set of success criteria, but the efforts that appear successful so far have tended to center on a particular crisis incident – such as a specific attack or missing person. But as more crowdsourcing public safety mobile applications are developed, adoption and use are likely to increase. One trend to watch is whether national public safety programs are able to tap into the existing social networks of community-based responders like American Red Cross volunteers, Community Emergency Response Teams, and United Way mentors.
The move toward crowdsourcing public safety is part of an overall trend toward improving community resilience, which refers to a system’s ability to bounce back after a crisis or disturbance. Stephen Flynn and his colleagues at Northeastern’s George J. Kostas Research Institute for Homeland Security are playing a key role in driving a national conversation in this area. Community resilience is inherently multi-disciplinary, so you see research being done regarding transportation infrastructure, social media use after a crisis event, and designing sustainable urban environments. Northeastern is a place where use-inspired research is addressing real-world problems. It will take a village to improve community resilience capabilities, and our institution is a vital part of thought leadership for that village.”
Twitter Datastream Used to Predict Flu Outbreaks
arXivBlog: “The rate at which people post flu-related tweets could become a powerful tool in the battle to spot epidemics earlier, say computer scientists.

The predictions are pretty good. The data generally closely matches that produced by government organisations such as the Centers for Disease Control and Prevention (CDC) in the US. Indeed, in some cases, it has been able to spot an incipient epidemic more than a week before the CDC.
That’s been hugely important. An early indication that the disease is spreading in a population gives governments a welcome headstart in planning its response.
So an interesting question is whether other online services, in particular social media, can make similar or even better predictions. Today, we have an answer thanks to the work of Jiwei Li at Carnegie Mellon University in Pittsburgh, and Claire Cardie at Cornell University in New York State, who have been able to detect the early stages of an influenza outbreak using Twitter.
Their approach is in many ways similar to Google’s. They simply filter the Twitter datastream for flu-related tweets that are also geotagged. That allows them to create a map showing the distribution of these tweets and how it varies over time.
They also model the dynamics of the disease with some interesting subtleties. In the new model, a flu epidemic can be in one of four phases: non-epidemic phase, a rising phase where numbers are increasing, a stationary phase and a declining phase where numbers are falling.
The new approach uses an algorithm that attempts to spot the switch from one phase to another as early as possible. Indeed, Li and Cardie test the effectiveness of their approach using a Twitter dataset of 3.6 million flu-related tweets from about 1 million people in the US between June 2008 and June 2010…
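The phase-switching idea can be illustrated with a toy example: label each day of a flu-tweet time series as rising, declining, or stationary by comparing the day’s count against a short moving average. This is only an illustration of the general approach, with assumed window and tolerance values; it is not Li and Cardie’s actual algorithm, which models the phase transitions statistically.

```python
# Toy phase labeling for a daily count series: compare each day against
# the trailing moving average. Window and tolerance are assumed values.
def phases(counts, window=3, tol=0.05):
    labels = []
    for i in range(window, len(counts)):
        prev_avg = sum(counts[i - window:i]) / window
        change = (counts[i] - prev_avg) / prev_avg if prev_avg else 0.0
        if change > tol:
            labels.append("rising")
        elif change < -tol:
            labels.append("declining")
        else:
            labels.append("stationary")
    return labels

daily_tweets = [10, 11, 10, 12, 18, 30, 29, 20, 12]
print(phases(daily_tweets))
# ['rising', 'rising', 'rising', 'rising', 'declining', 'declining']
```

The value of spotting the switch into the rising phase early is exactly what the article describes: even a few days of warning ahead of official surveillance gives health agencies a head start.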
Ref: arxiv.org/abs/1309.7340: Early Stage Influenza Detection from Twitter”
Data Discrimination Means the Poor May Experience a Different Internet
MIT Technology Review: “Data analytics are being used to implement a subtle form of discrimination, while anonymous data sets can be mined to reveal health data and other private information, a Microsoft researcher warned this morning at MIT Technology Review’s EmTech conference.
Kate Crawford, principal researcher at Microsoft Research, argued that these problems could be addressed with new legal approaches to the use of personal data.
In a new paper, she and a colleague propose a system of “due process” that would give people more legal rights to understand how data analytics are used in determinations made against them, such as denial of health insurance or a job. “It’s the very start of a conversation about how to do this better,” Crawford, who is also a visiting professor at the MIT Center for Civic Media, said in an interview before the event. “People think ‘big data’ avoids the problem of discrimination, because you are dealing with big data sets, but in fact big data is being used for more and more precise forms of discrimination—a form of data redlining.”
During her talk this morning, Crawford added that with big data, “you will never know what those discriminations are, and I think that’s where the concern begins.”