Online public services and Design Thinking for governments


Ela Alptekin: “The digital era has changed the expectations citizens have regarding the communication of public services and their engagement with government agencies. ‘Digital Citizenship’ is commonplace, and this is a great opportunity for institutions to explore the benefits this online presence offers.

Most government agencies have moved their public services to digital platforms by applying technology to the exact same workflow they had earlier. They’ve replaced hard copies with emails and handwritten signatures with digital ones. However, information technologies don’t just improve the efficiency of governments; they also have the power to transform how governments work by redefining their engagement with citizens. With this outlook, they can expand the array of services that could be provided and implemented.

When it comes to online public services, there are two different paths to building up a strategy. Governments can either use stats, trends and quantitative surveys to measure and produce “reliable results”, or they can develop a deeper understanding of the basic needs of their consumers for a specific problem. With that focus, they may propose a solid solution that would satisfy those needs.

Two of the primary criteria of evaluation in any measurement or observation are:

  1. Does the same measurement process yield the same results?

  2. Are we measuring what we intend to measure?

These two concepts are reliability and validity.

According to Roger Martin, author of “The Design of Business”, truly innovative organisations are those that have managed to balance the “reliability” of analytical thinking with the “validity” of abductive thinking. Many organisations often don’t find this balance between reliability and validity and choose only the reliable data to move on with their future implementations.

So what is the relationship between reliability and validity? The two do not necessarily go hand-in-hand.

“At best, we have a measure that has both high validity and high reliability. It yields consistent results in repeated application and it accurately reflects what we hope to represent.

It is possible to have a measure that has high reliability but low validity – one that is consistent in getting bad information or consistent in missing the mark. It is also possible to have one that has low reliability and low validity – inconsistent and not on target.

Finally, it is not possible to have a measure that has low reliability and high validity – you can’t really get at what you want or what you’re interested in if your measure fluctuates wildly.”
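The distinction can be illustrated with a toy simulation (hypothetical numbers, not from the article): two measurement processes estimating the same true value, one consistent but systematically biased (reliable but not valid), the other consistent and on target (reliable and valid).

```python
import random

random.seed(42)
TRUE_VALUE = 100.0  # the quantity we actually want to measure

def measure(bias, noise, n=1000):
    """Simulate n repeated measurements with a fixed bias and random noise."""
    return [TRUE_VALUE + bias + random.gauss(0, noise) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def spread(xs):
    """Standard deviation: low spread = high reliability (consistent results)."""
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# Reliable but not valid: tight spread, consistently off target
reliable_invalid = measure(bias=20.0, noise=1.0)
# Reliable and valid: tight spread, on target
reliable_valid = measure(bias=0.0, noise=1.0)

print(spread(reliable_invalid))               # small  -> reliable
print(mean(reliable_invalid) - TRUE_VALUE)    # large  -> not valid
print(abs(mean(reliable_valid) - TRUE_VALUE)) # small  -> valid
```

Averaging more repetitions of the biased process never fixes it: reliability improves with sample size, but validity is limited by the bias built into the measure itself.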

Many online government public services are based on reliable data and pay no attention to the validity of the results (the first figure: “reliable but not valid”).

What can government agencies use to balance reliability and validity when it comes to public services? The answer lies in Design Thinking and abductive reasoning.

…Design thinking helps agencies go back to the basics of what citizens need from their governments. It can be used to develop both reliable and valid online public services that are able to satisfy those needs…

As Government accelerates towards a world of public services that are digital by default, is this going to deliver the kind of digital services that move the public with them?

To find out, thinkpublic partnered with Consumer Focus (UK) to undertake detailed research into some of the fundamental questions and issues that users of digital public services are interested in. The findings have been published today in the Manifesto for Online Public Services, which sets out simple guiding principles to be placed at the heart of online service design.”

The transition towards transparency


Roland Harwood at the Open Data Institute Blog: “It’s a very exciting time for the field of open data, especially in the UK public sector which is arguably leading the world in this emerging discipline right now, in no small part thanks to the efforts of the Open Data Institute. There is a strong push to release public data and to explore new innovations that can be created as a result.
For instance, the Ordnance Survey have been leading the way with opening up half of their data for others to use, complemented by their GeoVation programme which provides support and incentive for external innovators to develop new products and services.
More recently the Technology Strategy Board have been working with the likes of NERC, Met Office, Environment Agency and other public agencies to help solve business problems using environmental data.
It goes without saying that data won’t leap up and create any value by itself, any more than a pile of discarded parts outside a factory will assemble itself into a car. We’ve found that the secret of successful open data innovation is to work with people who are trying to solve a specific problem. Simply releasing the data is not enough. See below a summary of our do’s and don’ts of opening up data:
Do…

  • Make sure data quality is high (ODI Certificates can help!)
  • Promote innovation using data sets. Transparency is only a means to an end
  • Enhance communication with external innovators
  • Make sure your co-creators are incentivised
  • Get organised, create a community around an issue
  • Pass on learnings to other similar organisations
  • Experiment – open data requires new mindsets and business models
  • Create safe spaces – Innovation Airlocks – to share and prototype with trusted partners
  • Be brave – people may do things with the data that you don’t like
  • Set out to create commercial or social value with data

Don’t…

  • Just release data and expect people to understand or create with it. Publication is not the same as communication
  • Wait for data requests, put the data out first informally
  • Avoid challenges to current income streams
  • Go straight for the finished article, use rapid prototyping
  • Be put off by the tensions between confidentiality, data protection and publishing
  • Wait for the big budget or formal process but start big things with small amounts now
  • Be technology led, be business led instead
  • Expect the community to entirely self-manage
  • Restrict open data to the IT literate – create interdisciplinary partnerships
  • Get caught in the false dichotomy that is commercial vs. social

In summary we believe we need to assume openness as the default (for organisations that is, not individuals) and secrecy as the exception – the exact opposite to how most commercial organisations currently operate. …”

Digital Participation – The Case of the Italian 'Dialogue with Citizens'


New paper by Gianluca Sgueo presented at Democracy and Technology – Europe in Tension from the 19th to the 21st Century – Sorbonne, Paris, 2013: “This paper focuses on the initiative named “Dialogue With Citizens” that the Italian Government introduced in 2012. The Dialogue was an entirely web-based experiment in participatory democracy aimed, first, at informing citizens through documents and in-depth analysis and, second, at answering their questions and requests. During the year and a half of life of the initiative, roughly 90,000 people wrote in (approximately 5,000 messages/month). Additionally, almost 200,000 participated in a number of public online consultations that the government launched in concomitance with the adoption of crucial decisions (i.e. the spending review national program).
From the analysis of this experiment of participatory democracy three questions can be raised. (1) How can a public institution maximize the profits of participation and minimize its costs? (2) How can public administrations manage the (growing) expectations of the citizens once they become accustomed to participation? (3) Is online participatory democracy going to develop further, and why?
In order to fully answer such questions, the paper proceeds as follows: it will initially provide a general overview of online public participation both at the central and the local level. It will then discuss the “Dialogue with Citizens” and a selected number of online public consultations led by the Italian government in 2012. The conclusions will develop a theoretical framework for reflection on the peculiarities and problems of web participation.”

Mobile phone data are a treasure-trove for development


Paul van der Boor and Amy Wesolowski in SciDevNet: “Each of us generates streams of digital information — a digital ‘exhaust trail’ that provides real-time information to guide decisions that affect our lives. For example, Google informs us about traffic by using both its ‘My Location’ feature on mobile phones and third-party databases to aggregate location data. BBVA, one of Spain’s largest banks, analyses transactions such as credit card payments as well as ATM withdrawals to find out when and where peak spending occurs. This type of data harvest is of great value. But, often, there is so much data that its owners lack the know-how to process it and fail to realise its potential value to policymakers.
Meanwhile, many countries, particularly in the developing world, have a dearth of information. In resource-poor nations, the public sector often lives in an analogue world where piles of paper impede operations and policymakers are hindered by uncertainty about their own strengths and capabilities. Nonetheless, mobile phones have quickly pervaded the lives of even the poorest: 75 per cent of the world’s 5.5 billion mobile subscriptions are in emerging markets. These people are also generating digital trails of anything from their movements to mobile phone top-up patterns. It may seem that putting this information to use would take vast analytical capacity. But using relatively simple methods, researchers can analyse existing mobile phone data, especially in poor countries, to improve decision-making.
Think of existing, available data as low-hanging fruit that we — two graduate students — could analyse in less than a month. This is not a test of data-scientist prowess, but more a way of saying that anyone could do it.
There are three areas that should be ‘low-hanging fruit’ in terms of their potential to dramatically improve decision-making in information-poor countries: coupling healthcare data with mobile phone data to predict disease outbreaks; using mobile phone money transactions and top-up data to assess economic growth; and predicting travel patterns after a natural disaster using historical movement patterns from mobile phone data to design robust response programmes.
Another possibility is using call-data records to analyse urban movement to identify traffic congestion points. Nationally, this can be used to prioritise infrastructure projects such as road expansion and bridge building.
The information that these analyses could provide would be lifesaving — not just informative or revenue-increasing, like much of this work currently performed in developed countries.
But some work of high social value is being done. For example, different teams of European and US researchers are trying to estimate the links between mobile phone use and regional economic development. They are using various techniques, such as merging night-time satellite imagery from NASA with mobile phone data to create behavioural fingerprints. They have found that this may be a cost-effective way to understand a country’s economic activity and, potentially, guide government spending.
Another example is given by researchers (including one of this article’s authors) who have analysed call-data records from subscribers in Kenya to understand malaria transmission within the country and design better strategies for its elimination. [1]
In this study, published in Science, the location data of the mobile phones of more than 14 million Kenyan subscribers was combined with national malaria prevalence data. After identifying the sources and sinks of malaria parasites and overlaying these with phone movements, analysis was used to identify likely transmission corridors. UK scientists later used similar methods to create different epidemic scenarios for Côte d’Ivoire.”

5 Ways Cities Are Using Big Data


Eric Larson in Mashable: “New York City released more than 200 high-value data sets to the public on Monday — a way, in part, to provide more content for open-sourced mapping projects like OpenStreetMap.
It’s one of many releases since Local Law 11 of 2012 passed in February, which calls for more transparency of the city government’s collected data.
But it’s not just New York: Cities across the world, large and small, are utilizing big data sets — like traffic statistics, energy consumption rates and GPS mapping — to launch projects to help their respective communities.
We rounded up a few of our favorites below….

1. Seattle’s Power Consumption

The city of Seattle recently partnered with Microsoft and Accenture on a pilot project to reduce the area’s energy usage. Using Microsoft’s Azure cloud, the project will collect and analyze hundreds of data sets collected from four downtown buildings’ management systems.
With predictive analytics, then, the system will work to find out what’s working and what’s not — i.e. where energy can be used less, or not at all. The goal is to reduce power usage by 25%.

2. SpotHero

Finding parking spots — especially in big cities — is undoubtedly a headache.

SpotHero is an app, for both iOS and Android devices, that tracks down parking spots in a select number of cities. How it works: Users type in an address or neighborhood (say, Adams Morgan in Washington, D.C.) and are taken to a listing of available garages and lots nearby — complete with prices and time durations.
The app tracks availability in real-time, too, so a spot is updated in the system as soon as it’s snagged.
Seven cities are currently synced with the app: Washington, D.C., New York, Chicago, Baltimore, Boston, Milwaukee and Newark, N.J.

3. Adopt-a-Hydrant

Anyone who’s spent a winter in Boston will agree: it snows.

In January, the city’s Office of New Urban Mechanics released an app called Adopt-a-Hydrant. The program is mapped with every fire hydrant in the city proper — more than 13,000, according to a Harvard blog post — and lets residents pledge to shovel out one, or as many as they choose, in the almost inevitable event of a blizzard.
Once a pledge is made, volunteers receive a notification if their hydrant — or hydrants — become buried in snow.

4. Adopt-a-Sidewalk

Similar to Adopt-a-Hydrant, Chicago’s Adopt-a-Sidewalk app lets residents of the Windy City pledge to shovel sidewalks after snowfall. In a city just as notorious for snowstorms as Boston, it’s an effective way to ensure public spaces remain free of snow and ice — especially spaces belonging to the elderly or disabled.

If you’re unsure which part of town you’d like to “adopt,” just register on the website and browse the map — you’ll receive a pop-up notification for each street you swipe that’s still available.

5. Less Congestion for Lyon

Last year, researchers at IBM teamed up with the city of Lyon, France (about four hours south of Paris), to build a system that helps traffic operators reduce congestion on the road.

The system, called the “Decision Support System Optimizer (DSSO),” uses real-time traffic reports to detect and predict congestion. If an operator sees that a traffic jam is likely to occur, he or she can then adjust traffic signals accordingly to keep the flow of cars moving smoothly.
It’s an especially helpful tool for emergencies — say, when an ambulance is en route to the hospital. Over time, the algorithms in the system will “learn” from its most successful recommendations, then apply that knowledge when making future predictions.”

UK: Good law


“The good law initiative is an appeal to everyone interested in the making and publishing of law to come together with a shared objective of making legislation work well for the users of today and tomorrow…People find legislation difficult. The volume of statutes and regulations, their piecemeal structure, and their level of detail and frequent amendments, make legislation hard to understand and difficult to comply with. That can hinder economic activity. It can create burdens for businesses and communities. It can obstruct good government, and it can undermine the rule of law…
Good law is not a checklist, or a call for more process. It straddles four areas that have traditionally been regarded as separate domains. We think that they are inter-connected, and we invite good law partners to consider each of them from the different perspectives of citizens, professional users and legislators.

[Diagram: “Good law” at the centre, linked to the questions: How much detail? Is this law necessary? Does it duplicate, or conflict with, another law? Do we know what the likely readership is? Is the language easy to understand?]
Good law from the perspectives of citizens, professional users and legislators…
Good law sits naturally alongside…

Participatory Budgeting Around the World


Jay Colburn, from the International Budget Partnership:  “Public participation in budget decision making can occur in many different forms. Participatory budgeting (PB) is an increasingly popular process in which the public is involved directly in making budgetary decisions, most often at the local level. The involvement of community members usually includes identifying and prioritizing the community’s needs and then voting on spending for specific projects.
PB was first developed in Porto Alegre, Brazil, in 1989 as an innovative reform to address the city’s severe inequality. Since then it has spread around the world. Though the specifics of how the PB process works vary depending on the context in which it is implemented, most PB processes have four basic similarities: 1) community members identify spending ideas; 2) delegates are selected to develop spending proposals based on those ideas; 3) residents vote on which proposals to fund; and 4) the government implements the chosen proposals.
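As a rough sketch of the last two steps (all figures and proposal names hypothetical, not from any real PB round), the voting and funding stage often amounts to funding the top-voted proposals until a fixed budget runs out:

```python
# Hypothetical PB round: residents vote on costed proposals, and the
# most-voted proposals are funded until the budget is exhausted.
BUDGET = 100_000

# (proposal, cost, votes) -- illustrative figures only
proposals = [
    ("Repave school crossing", 40_000, 812),
    ("New park benches",       15_000, 655),
    ("Street lighting",        60_000, 990),
    ("Community garden",       25_000, 430),
]

def fund(proposals, budget):
    """Greedily fund proposals in descending vote order while money remains."""
    funded, remaining = [], budget
    for name, cost, votes in sorted(proposals, key=lambda p: p[2], reverse=True):
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded, remaining

funded, left = fund(proposals, BUDGET)
print(funded)  # ['Street lighting', 'Repave school crossing']
print(left)    # 0
```

Real PB rules differ by city (some vote per district, some rank rather than approve), but the budget-constrained selection above is the common core.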
During the 1990s PB spread throughout Brazil and across Latin America. Examples of participatory budgeting can now be found in every region of the world, including Central Asia, Europe, and the Middle East. As the use of PB has expanded, it has been adapted in many ways. One example is to incorporate new information and communication technologies as a way to broaden opportunities for participation (see Using Technology to Improve Transparency and Citizen Engagement in this newsletter for more on this topic.)…
There are also a number of different models of PB that have been developed, each with slightly different rules and processes. Using the different models and methods has expanded our knowledge on the potential impacts of PB. In addition to having demonstrable and measurable results on mobilizing public funds for services for the poor, participatory budgeting has also been linked to greater tax compliance, increased demands for transparency, and greater access to budget information and oversight.
However, not all instances of PB are equally successful; there are many variables to consider when weighing the impact of different cases. These can include the level and mechanisms of participation, information accessibility, knowledge of opportunities to participate, political context, and prevailing socioeconomic factors. There is a large and growing literature on the benefits and challenges of PB. The IBP Open Budgets Blog recently featured posts on participatory budgeting initiatives in Peru, Kyrgyzstan, and Kenya. While there are still many lessons to be learned about how PB can be used in different contexts, it is certainly a positive step toward increased citizen engagement in the budget process and influence over how public funds are spent.
For more information and resources on PB, visit the participatory budgeting Facebook group”

San Francisco To Test Online Participatory Budgeting


Crunch.gov: “Taxpayers are sometimes the best people to decide how their money gets spent — sounds obvious, but usually we don’t have a direct say beyond who we elect. That’s changing for San Francisco residents.
San Francisco intends to be the first major US city to allow citizens to vote directly on portions of the budget via the web. While details are still coming together, its plan is for each city district to vote on $100,000 in expenditures. Citizens will get to choose how the money is spent from a list of options, similar to the way they already vote from a list of ballot propositions. Topical experts will help San Francisco residents deliberate online.
So-called “participatory budgeting” first began in the festival city of Porto Alegre, Brazil, in 1989, and has slowly been expanding throughout the world. While major cities, such as Chicago and New York, have piloted participatory budgeting, they have not incorporated the modern features of digital voting and deliberation that are currently utilized in Brazil.
According to participatory budgeting expert and former White House technology fellow, Hollie Russon Gilman, San Francisco’s experiment will mark a “frontier” in American direct democracy.
This is significant because the Internet engenders a different type of democracy: not one of mere expression, but one of ideas. The net is good at surfacing the best ideas hidden within the wisdom of the crowds. Modern political scientists refer to this as “Epistemic Democracy,” derived from the Greek word for knowledge, epistēmē. Epistemic Democracy values citizens most for their expertise and builds tools to make policy making more informed.
For example, participatory budgeting has been found to reduce infant mortality rates in Brazil. It turns out that the mothers in Brazil had a better knowledge of why children were dying than health experts. Through participatory budgeting, they “channeled a larger fraction of their total budget to key investments in sanitation and health services,” writes Sonia Goncalves of King’s College London. “I also found that this change in the composition of municipal expenditures is associated with a pronounced reduction in the infant mortality rates for municipalities which adopted participatory budgeting.” [PDF]”

Open Access


Reports by the UK’s House of Commons, Business, Innovation and Skills Committee: “Open access refers to the immediate, online availability of peer reviewed research articles, free at the point of access (i.e. without subscription charges or paywalls). Open access relates to scholarly articles and related outputs. Open data (which is a separate area of Government policy and outside the scope of this inquiry) refers to the availability of the underlying research data itself. At the heart of the open access movement is the principle that publicly funded research should be publicly accessible. Open access expanded rapidly in the late twentieth century with the growth of the internet and digitisation (the transcription of data into a digital form), as it became possible to disseminate research findings more widely, quickly and cheaply.
Whilst there is widespread agreement that the transition to open access is essential in order to improve access to knowledge, there is a lack of consensus about the best route to achieve it. To achieve open access at scale in the UK, there will need to be a shift away from the dominant subscription-based business model. Inevitably, this will involve a transitional period and considerable change within the scholarly publishing market.
For the UK to transition to open access, an effective, functioning and competitive market in scholarly communications will be vital. The evidence we saw over the course of this inquiry shows that this is currently far from the case, with journal subscription prices rising at rates that are unsustainable for UK universities and other subscribers. There is a significant risk that the Government’s current open access policy will inadvertently encourage and prolong the dysfunctional elements of the scholarly publishing market, which are a major barrier to access.”
See Volume I and Volume II.

Visualizing the legislative process with Sankey diagrams


Kamil Gregor at OpeningParliament.org: “The process of shaping the law often resembles an Indiana Jones maze. Bills and amendments run through an elaborate system of committees, sessions and hearings filled with booby traps before finally reaching the golden idol of a final approval.
Parliamentary monitoring organizations and researchers are often interested in how various pieces of legislation survive in this environment and what strategies exist to either kill or aid them. This specifically means answering two questions: what is the probability of a bill being approved, and what factors determine this probability?
The legislative process is usually hierarchical: Successful completion of a step in the process is conditioned by completion of all previous steps. Therefore, we may also want to know the probabilities of completion in each consecutive step and their determinants.
A simple way to give a satisfying answer to these questions, without wandering into the land of nonlinear logistic regressions, is the Sankey diagram. It is a famous flow chart in which a process is visualized using arrows. Relative quantities of outcomes in the process are represented by the arrows’ widths.
A famous example is a Sankey diagram of Napoleon’s invasion of Russia. We can clearly see how the Grand Army was gradually shrinking as French soldiers were dying or defecting. Another well-known example is the Google Analytics flow chart. It shows how many visitors enter a webpage and then either leave or continue to a different page on the same website. As the number of consecutive steps increases, the number of visitors remaining on the website decreases.
The legislative process can be visualized in the same way. Progress of bills is represented by a stream between various steps in the process and width of the stream corresponds to quantities of bills. A bill can either complete all the steps of the process, or it can “drop out” of it at some point if it gets rejected.
Let’s take a look…”
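The stream widths in such a diagram are just counts of bills surviving each stage. A minimal sketch (the stage names and bill counts below are hypothetical, not from any real parliament) that computes the flows a Sankey chart would draw:

```python
from collections import Counter

# Consecutive steps of a simplified legislative process
STAGES = ["introduced", "committee", "floor vote", "approved"]

# Furthest stage each hypothetical bill reached, tallied per stage
last_stage = Counter({"introduced": 40, "committee": 25,
                      "floor vote": 15, "approved": 20})

def flows(last_stage, stages):
    """Return, per stage, the stream width entering it and the drop-outs there."""
    entering, dropped = [], []
    for i, stage in enumerate(stages):
        # Bills entering a stage are all bills that reached this stage or later
        surviving = sum(last_stage[s] for s in stages[i:])
        entering.append((stage, surviving))
        # Bills whose journey ended here (none drop out of the final stage)
        dropped.append((stage, last_stage[stage] if stage != stages[-1] else 0))
    return entering, dropped

entering, dropped = flows(last_stage, STAGES)
print(entering)  # [('introduced', 100), ('committee', 60), ('floor vote', 35), ('approved', 20)]
print(dropped)   # [('introduced', 40), ('committee', 25), ('floor vote', 15), ('approved', 0)]
```

Because the process is hierarchical, each entering count can only shrink from one stage to the next; feeding these stage-to-stage counts into a plotting library’s Sankey trace then maps them directly to stream widths, and the ratio of consecutive counts gives the completion probability for each step.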