DIY ‘Public Service Design’ manual


The Spider Project: “Service design is a method for inventing or improving services. It is an interdisciplinary method that makes use of ‘design thinking’. Service design helps with designing services from the perspective of the user.

Not by guessing what these users might want, but by truly co-creating relevant, effective and efficient services in collaboration with them. The basic principles of service design are that the designed service should be user-friendly and desired, and must respond to the needs and motivations of customers and citizens.

This manual guides civil servants in tendering, evaluating and managing service design projects, and shows the added value of design professionals when they bring their skills, knowledge and experience to the table.

This practical guide is filled with examples and case studies that will enable public organisations to obtain enough insights and confidence in service design in order to start working with it themselves.

Download a copy of Public Service Design

Opening City Hall’s Wallets to Innovation


Tina Rosenberg at the New York Times: “Six years ago, the city of San Francisco decided to upgrade its streetlights. This is its story: O.K., stop. This is a parody, right? Government procurement is surely too nerdy even for Fixes. Procurement is a clerical task that cities do on autopilot: Decide what you need. Write a mind-numbing couple of dozen pages of specifications. Collect a few bids from the usual suspects. Yep, that’s procurement. But it doesn’t have to be. Instead of a rote purchasing exercise, what if procurement could be a way for cities to find new approaches to their problems?….

“Instead of saying to the marketplace ‘here’s the solution we want,’ we said ‘here’s the challenge, here’s the problem we’re having’,” said Barbara Hale, assistant general manager of the city’s Public Utilities Commission. “That opened us up to what other people thought the solution to the problem was, rather than us in our own little world deciding we knew the answer.”

The city got 59 different ideas from businesses in numerous countries. A Swiss company called Paradox won an agreement to do a 12-streetlight pilot test.

So — a happy ending for the scrappy and innovative Paradox? No. Paradox’s system worked, but the city could not award a contract for 18,500 streetlights that way. It held another competition for just the control systems, and tried out three of them. Last year the city issued a traditional R.F.P., using what it learned from the pilots. The contract has not yet been awarded.

Dozens of cities around the world are using problem-based procurement. Barcelona has posed six challenges that it will spend a million euros on, and Moscow announced last year that five percent of city spending would be set aside for innovative procurement. But in the vast majority of cities, as in San Francisco, problem-based procurement is still just for small pilot projects — a novelty.

It will grow, however. This is largely because of the efforts of CityMart, a company based in New York and Barcelona that has almost single-handedly taken the concept from a neat idea to something cities all over want to figure out how to do.

The concept is new enough that there’s not yet a lot of evidence about its effects. There’s plenty of proof, however, of the deficiencies of business-as-usual.

With the typical R.F.P., a city uses a consultant, working with local officials, to design what to ask for. Then city engineers and lawyers write the specifications, and the R.F.P. goes out for bids.

“If it’s a road safety issue it’s likely it will be the traffic engineers who will be asked to tell you what you can do, what you should invest in,” said Sascha Haselmayer, CityMart’s chief executive. “They tend to come up with things like traffic lights. They do not know there’s a world of entrepreneurs who work on educating drivers better, or that have a different design approach to public space — things that may not fit into the professional profile of the consultant.”

Such a process is guaranteed to be innovation-free. Innovation is far more likely when expertise from one discipline is applied to another. If you want the most creative solution to a traffic problem, ask people who aren’t traffic engineers.

The R.F.P. process itself was designed to give anyone a shot at a contract, but in reality, the winners almost always come from a small group of businesses with the required financial stability, legal know-how to negotiate the bureaucracy, and connections. Put those together, and cities get to consider only a tiny spectrum of the possible solutions to their problems.

Problem-based procurement can provide them with a whole rainbow. But to do that, the process needs clearinghouses — eBays or Craigslists for urban ideas….(More)”

Ethical, Safe, and Effective Digital Data Use in Civil Society


Blog by Lucy Bernholz, Rob Reich, Emma Saunders-Hastings, and Emma Leeds Armstrong: “How do we use digital data ethically, safely, and effectively in civil society? We have developed three early principles for consideration:

  • Default to person-centered consent.
  • Prioritize privacy and minimum viable data collection.
  • Plan from the beginning to open (share) your work.

This post provides a synthesis from a one day workshop that informed these principles. It concludes with links to draft guidelines you can use to inform partnerships between data consultants/volunteers and nonprofit organizations….(More)

These three values — consent, minimum viable data collection, and open sharing — comprise a basic framework for ethical, safe, and effective use of digital data by civil society organizations. They should be integrated into partnerships with data intermediaries and, perhaps, into general data practices in civil society.

We developed two tools to guide conversations between data volunteers and/or consultants and nonprofits. These are downloadable below. Please use them, share them, improve them, and share them again….

  1. Checklist for NGOs and external data consultants
  2. Guidelines for NGOs and external data consultants (More)”

Smoke Signals: Open data & analytics for preventing fire deaths


Enigma: “Today we are launching Smoke Signals, an open source civic analytics tool that helps local communities determine which city blocks are at the highest risk of not having a smoke alarm.

25,000 people are killed or injured in 1 million fires across the United States each year. Of the more than 130 million housing units across the country, 4.5 million do not have smoke detectors, placing their inhabitants at substantial risk. Driving this number down is the single most important factor for saving lives put at risk by fire.

Organizations like the Red Cross are investing a lot of resources to buy and install smoke alarms in people’s homes. But a big challenge remains: in a city of millions, what doors should you knock on first when conducting an outreach effort?

We began working on the problem of targeting the blocks at highest risk of not having a smoke alarm with the City of New Orleans last spring. (You can read about this work here.) Over the past few months, with collaboration from the Red Cross and DataKind, we’ve built out a generalized model and a set of tools to offer the same analytics potential to 178 American cities, all in a way that is simple to use and sensitive to how on-the-ground operations are organized.

We believe that Smoke Signals is more a collection of tools and collaborations than it is a slick piece of software that can somehow act as a panacea to the problem of fire fatalities. Core to its purpose and mission are a set of commitments:

  • an ongoing collaboration with the Red Cross wherein our smoke alarm work informs their on-the-ground outreach
  • a collaboration with DataKind to continue applying volunteer work to the improvement of the underlying models and data that drive the risk analysis
  • a working relationship with major American cities to help integrate our prediction models into their outreach programs

and tools:

  • a downloadable CSV for each of 178 American municipalities that associates city streets with risk scores (a usage sketch follows this list)
  • an interactive map for an immediate bird’s eye assessment of at-risk city blocks
  • an API endpoint to which users can upload a CSV of local fire incidents in order to improve scores for their area
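
As a rough illustration of how the downloadable CSV might be consumed, here is a minimal Python sketch. It is not taken from Enigma's documentation: the file name and the column names (`street`, `risk_score`) are assumptions made for the sake of the example, and an outreach team would substitute the actual headers of their city's file.

```python
# Hypothetical sketch: rank city blocks by smoke-alarm risk from a
# downloaded Smoke Signals-style CSV. Column names are assumed, not
# taken from Enigma's actual schema.
import csv

def top_risk_blocks(path, n=10):
    """Return the n street blocks with the highest risk scores."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Sort descending by risk score so an outreach team knows
    # which doors to knock on first.
    rows.sort(key=lambda r: float(r["risk_score"]), reverse=True)
    return rows[:n]

if __name__ == "__main__":
    # "new_orleans_risk_scores.csv" is a placeholder file name.
    for block in top_risk_blocks("new_orleans_risk_scores.csv"):
        print(block["street"], block["risk_score"])
```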

We believe this is an important contribution to public safety and the better delivery of government services. However, we also consider it a work in progress, a demonstration of how civic analytic solutions can be shared and generalized across the country. We are open sourcing all of the components that went into it and invite anyone with an interest in making it better to get involved….(More)”

Research on digital identity ecosystems


Francesca Bria et al at NESTA/D-CENT: “This report presents a concrete analysis of the latest evolution of the identity ecosystem in the big data context, focusing on the economic and social value of data and identity within the current digital economy. This report also outlines economic, policy, and technical alternatives to develop an identity ecosystem and management of data for the common good that respects citizens’ rights, privacy and data protection.

Key findings

  • This study presents a review of the concept of identity and a map of the key players in the identity industry (such as data brokers and data aggregators), including empirical case studies of identity management in key sectors.
    ….
  • The “datafication” of individuals’ social lives, thoughts and moves is a valuable commodity and constitutes the backbone of the “identity market”, within which “data brokers” (collectors, purchasers or sellers) play different key roles in creating the market by offering services such as fraud detection, customer relations, predictive analytics, marketing and advertising.
  • Economic, political and technical alternatives for identity that preserve trust, privacy and data ownership in today’s big data environments are formulated. The report looks into access to data, economic strategies to manage data as commons, consent and licensing, tools to control data, and terms of service. It also looks into policy strategies such as privacy and data protection by design, and trust and ethical frameworks. Finally, it assesses technical implementations, looking at identity and anonymity; cryptographic tools; security; and decentralisation and blockchains. It also analyses the future steps needed to move toward the suggested technical strategies….(More)”

Data Collaboratives: Sharing Public Data in Private Hands for Social Good


Beth Simone Noveck (The GovLab) in Forbes: “Sensor-rich consumer electronics such as mobile phones, wearable devices, commercial cameras and even cars are collecting zettabytes of data about the environment and about us. According to one McKinsey study, the volume of data is growing at fifty percent a year. No one needs convincing that these private storehouses of information represent a goldmine for business, but these data can do double duty as rich social assets—if they are shared wisely.

Think about a couple of recent examples: Sharing data held by businesses and corporations (i.e. public data in private hands) can help to improve policy interventions. California planners make water allocation decisions based upon expertise, data and analytical tools from public and private sources, including Intel, the Earth Research Institute at the University of California at Santa Barbara, and the World Food Center at the University of California at Davis.

In Europe, several phone companies have made anonymized datasets available, making it possible for researchers to track calling and commuting patterns and gain better insight into social problems from unemployment to mental health. In the United States, LinkedIn is providing free data about demand for IT jobs in different markets which, when combined with open data from the Department of Labor, helps communities target efforts around training….

Despite the promise of data sharing, these kinds of data collaboratives remain relatively new. There is a need to accelerate their use by giving companies strong tax incentives for sharing data for public good. There’s a need for more study to identify models for data sharing in ways that respect personal privacy and security and enable companies to do well by doing good. My colleagues at The GovLab together with UN Global Pulse and the University of Leiden, for example, published this initial analysis of terms and conditions used when exchanging data as part of a prize-backed challenge. We also need philanthropy to start putting money into “meta research”; it’s not going to be enough to just open up databases: we need to know if the data is good.

After years of growing disenchantment with closed-door institutions, the push for greater use of data in governing can be seen as both a response and as a mirror to the Big Data revolution in business. Although more than 1,000,000 government datasets about everything from air quality to farmers markets are openly available online in downloadable formats, much of the data about environmental, biometric, epidemiological, and physical conditions rest in private hands. Governing better requires a new empiricism for developing solutions together. That will depend on access to these private, not just public data….(More)”

This free online encyclopedia has achieved what Wikipedia can only dream of


Nikhil Sonnad at Quartz: “The Stanford Encyclopedia of Philosophy may be the most interesting website on the internet. Not because of the content—which includes fascinating entries on everything from ambiguity to zombies—but because of the site itself.

Its creators have solved one of the internet’s fundamental problems: How to provide authoritative, rigorously accurate knowledge, at no cost to readers. It’s something the encyclopedia, or SEP, has managed to do for two decades.

The internet is an information landfill. Somewhere in it—buried under piles of opinion, speculation, and misinformation—is virtually all of human knowledge. But sorting through the trash is difficult work. Even when you have something you think is valuable, it often turns out to be a cheap knock-off.

The story of how the SEP is run, and how it came to be, shows that it is possible to create a less trashy internet—or at least a less trashy corner of it. A place where actual knowledge is sorted into a neat, separate pile instead of being thrown into the landfill. Where the world can go to learn everything that we know to be true. Something that would make humans a lot smarter than the internet we have today.

The impossible trinity of information

The online SEP has humble beginnings. Edward Zalta, a philosopher at Stanford’s Center for the Study of Language and Information, launched it way back in September 1995, with just two entries.


That makes it positively ancient in internet years. Even Wikipedia is only 14….

John Perry, the director of the center, was the one who first suggested a dictionary of philosophical terms. But Zalta had bigger ideas. He and two co-authors later described the challenge in a 2002 paper (pdf, p. 1):

A fundamental problem faced by the general public and the members of an academic discipline in the information age is how to find the most authoritative, comprehensive, and up-to-date information about an important topic.

That paper is so old that it mentions “CD-ROMs” in the second sentence. But for all the years that have passed, the basic problem remains unsolved. The three requirements the authors list—”authoritative, comprehensive, and up-to-date”—are to information what the “impossible trinity” is to economics. You can only ever have one or two at once. It is like having your cake, eating it, and then bringing it to another party.

Yet if the goal is to share with people what is true, it is extremely important for a resource to have all of these things. It must be trusted. It must not leave anything out. And it must reflect the latest state of knowledge. Unfortunately, all of the other current ways of designing an encyclopedia very badly fail to meet at least one of these requirements.

Where other encyclopedias fall short

Book
Authoritative: √ | Comprehensive: X | Up-to-date: X


Printed books are authoritative: Readers trust articles they know have been written and edited by experts. Books also produce a coherent overview of a subject, as the editors consider how each entry fits into the whole. But they become obsolete whenever new research comes out. Nor can a book (or even a set of volumes) be comprehensive, except perhaps for a very narrow discipline; there’s simply too much to print.

Crowdsourcing
Authoritative: X | Comprehensive: X | Up-to-date: √

A crowdsourced online encyclopedia has the virtue of timeliness. Thanks to Wikipedia’s vibrant community of non-experts, its entries on breaking-news events are often updated as they happen. But except perhaps in a few areas in which enough well-informed people care for errors to get weeded out, Wikipedia is not authoritative. One math professor reviewed basic mathematics entries and found them to be “a hot mess of error, arrogance, obscurity, and nonsense.” Nor is it comprehensive: Though it has nearly 5 million articles in the English-language version alone, seemingly in every sphere of knowledge, fewer than 10,000 are “A-class” or better, the status awarded to articles considered “essentially complete.”

Speaking of holes, the SEP has a rather detailed entry on the topic of holes, and it rather nicely illustrates one of Wikipedia’s key shortcomings. Holes present a tricky philosophical problem, the SEP entry explains: A hole is nothing, but we refer to it as if it were something. (Achille Varzi, the author of the holes entry, was called upon in the US presidential election in 2000 to weigh in on the existential status of hanging chads.) If you ask Wikipedia for holes it gives you the young-adult novel Holes and the band Hole.

In other words, holes as philosophical notions are too abstract for a crowdsourced venue that favors clean, factual statements like a novel’s plot or a band’s discography. Wikipedia’s bottom-up model could never produce an entry on holes like the SEP’s.

Crowdsourcing + voting
Authoritative: ? | Comprehensive: X | Up-to-date: ?

A variation on the wiki model is question-and-answer sites like Quora (general interest) and StackOverflow (computer programming), on which users can pose questions and write answers. These are slightly more authoritative than Wikipedia, because users also vote answers up or down according to how helpful they find them; and because answers are given by single, specific users, who are encouraged to say why they’re qualified (“I’m a UI designer at Google,” say).

But while there are sometimes ways to check people’s accreditation, it’s largely self-reported and unverified. Moreover, these sites are far from comprehensive. Any given answer is only as complete as its writer decides or is able to make it. And the questions asked and answered tend to reflect the interests of the sites’ users, which in both Quora and StackOverflow’s cases skew heavily male, American, and techie.

Moreover, the sites aren’t up-to-date. While they may respond quickly to new events, answers that become outdated aren’t deleted or changed but stay there, burdening the site with a growing mass of stale information.

The Stanford solution

So is the impossible trinity just that—impossible? Not according to Zalta. He imagined a different model for the SEP: the “dynamic reference work.”

Dynamic reference work
Authoritative: √ | Comprehensive: √ | Up-to-date: √

To achieve authority, several dozen subject editors—responsible for broad areas like “ancient philosophy” or “formal epistemology”—identify topics in need of coverage, and invite qualified philosophers to write entries on them. If the invitation is accepted, the author sends an outline to the relevant subject editors.

“An editor works with the author to get an optimal outline before the author begins to write,” says Susanna Siegel, subject editor for philosophy of mind. “Sometimes there is a lot of back and forth at this stage.” Editors may also reject entries. Zalta and Uri Nodelman, the SEP’s senior editor, say that this almost never happens. In the rare cases when it does, the reason is usually that an entry is overly biased. In short, this is not somebody randomly deciding to answer a question on Quora.

An executive editorial board—Zalta, Nodelman, and Colin Allen—works to make the SEP comprehensive….(More)”

Routledge International Handbook of Ignorance Studies


Book edited by Matthias Gross and Linsey McGoey: “Once treated as the absence of knowledge, ignorance today has become a highly influential topic in its own right, commanding growing attention across the natural and social sciences, where a wide range of scholars have begun to explore the social life and political issues involved in the distribution and strategic use of not knowing. The field is growing fast, and this handbook reflects its interdisciplinary character by drawing contributions from economics, sociology, history, philosophy, cultural studies, anthropology, feminist studies, and related fields in order to serve as a seminal guide to the political, legal and social uses of ignorance in social and political life….(More)”

Why interdisciplinary research matters


Special issue of Nature: “To solve the grand challenges facing society — energy, water, climate, food, health — scientists and social scientists must work together. But research that transcends conventional academic boundaries is harder to fund, do, review and publish — and those who attempt it struggle for recognition and advancement (see World View, page 291). This special issue examines what governments, funders, journals, universities and academics must do to make interdisciplinary work a joy rather than a curse.

A News Feature on page 308 asks where the modern trend for interdisciplinary research came from — and finds answers in the proliferation of disciplines in the twentieth century, followed by increasingly urgent calls to bridge them. An analysis of publishing data explores which fields and countries are embracing interdisciplinary research the most, and what impact such research has (page 306). On page 313, Rick Rylance, head of Research Councils UK and himself a researcher with one foot in literature and one in neuroscience, explains why interdisciplinarity will be the focus of a 2015–16 report from the Global Research Council. Around the world, government funding agencies want to know what it is, whether they should invest in it, whether they are doing so effectively and, if not, what must change.

How can scientists successfully pursue research outside their comfort zone? Some answers come from Rebekah Brown, director of Monash University’s Monash Sustainability Institute in Melbourne, Australia, and her colleagues. They set out five principles for successful interdisciplinary working that they have distilled from years of encouraging researchers of many stripes to seek sustainability solutions (page 315). Similar ideas help scientists, curators and humanities scholars to work together on a collection that includes clay tablets, papyri, manuscripts and e-mail archives at the John Rylands Research Institute in Manchester, UK, reveals its director, Peter Pormann, on page 318.

Finally, on page 319, Clare Pettitt reassesses the multidisciplinary legacy of Richard Francis Burton — Victorian explorer, ethnographer, linguist and enthusiastic amateur natural scientist who got some things very wrong, but contributed vastly to knowledge of other cultures and continents. Today’s would-be interdisciplinary scientists can draw many lessons from those of the past — and can take our polymathy quiz online at nature.com/inter. (Nature special: Interdisciplinarity)

Openness: An Essential Building Block for Inclusive Societies


 (Mexico) in the Huffington Post: “The international community faces a complex environment that requires transforming the way we govern. In that sense, 2015 marks a historic milestone, as 193 Member States of the United Nations will come together to agree on the adoption of the 2030 Agenda. With the definition of the 17 Sustainable Development Goals (SDGs), we will set an ambitious course toward a better and more inclusive world for the next 15 years.

The SDGs will be established just when governments deal with new and more defiant challenges, which require increased collaboration with multiple stakeholders to deliver innovative solutions. For that reason, cutting-edge technologies, fueled by vast amounts of data, provide an efficient platform to foster a global transformation and consolidate more responsive, collaborative and open governments.

Goal 16 seeks to promote just, peaceful and inclusive societies by ensuring access to public information, strengthening the rule of law, as well as building stronger and more accountable institutions. By doing so, we will contribute to successfully achieve the rest of the 2030 Agenda objectives.

During the 70th United Nations General Assembly, the 11 countries of the Steering Committee of the Open Government Partnership (OGP), along with civil-society leaders, will gather to acknowledge Goal 16 as a common target through a Joint Declaration: Open Government for the Implementation of the 2030 Agenda for Sustainable Development. As the Global Summit of OGP convenes this year in Mexico City, on October 28th and 29th, my government will call on all 65 members to subscribe to this fundamental declaration.

The SDGs will be reached only through trustworthy, effective and inclusive institutions. This is why Mexico, as current chair of the OGP, has committed to promote citizen participation, innovative policies, transparency and accountability.

Furthermore, we have worked with a global community of key players to develop the international Open Data Charter (ODC), which sets the founding principles for a greater coherence and increased use of open data across the world. We seek to recognize the value of having timely, comprehensive, accessible, and comparable data to improve governance and citizen engagement, as well as to foster inclusive development and innovation….(More)”