New York City’s Digital Playbook


New York City Mayor Bill de Blasio: “…The New York City Digital Playbook outlines how we want residents to experience City services and how we will use digital tools to strengthen communities, online and off. The guidance within the Playbook will challenge all of our agencies and service providers to rethink the way they reach New Yorkers.

Our goal is to make our services more accessible, make our operations more transparent, and make it easy and fun to participate in government. In short — we aim to make New York the most user-friendly and innovative city in the world.

We believe that City government should be at New Yorkers’ fingertips and services should be just a swipe or a click away — just like so much of the technology in the rest of our lives. We also know that there are many people in New York City’s incomparable tech and design community who share this goal and want to lend expertise. So, another important goal of the Playbook is to make it easier for civically minded technologists to help us.

How to use the Playbook

This is an internal vision and strategy document that we will immediately begin to implement across government.

You may ask: if it’s an internal document, why share it publicly? A few reasons:

1. Transparency is a central tenet of our work.
2. This is a work in progress that we want to develop and update in the open.
3. We want to know how you think we can make it better.

The Playbook lives on nyc.gov/playbook.

Also, we’re giving a printed “strategy deck,” or set of cards, to staff across the city. Each card has a different principle or strategy printed on the front and key explanations and tips on the back. City leaders will use these cards to plan together and inspire each other when they’re designing new services, or thinking about how to make existing services better….(More)”

Is behavioural economics ready to save the world?


Book review by Trenton G Smith of Behavioral Economics and Public Health: “Modern medicine has long doled out helpful advice to ailing patients about not only drug treatments, but also diet, exercise, alcohol abuse, and many other lifestyle decisions. And for just as long, patients have been failing to follow doctors’ orders. Many of today’s most pressing public health problems would disappear if people would just make better choices.

Enter behavioural economics. A fairly recent offshoot of the dismal science, behavioural economics aims to take the coldly rational decision makers who normally populate economic theories, and instil in them a host of human foibles. Neoclassical (ie, conventional) economics, after all, is the study of optimising behaviour in the presence of material constraints—why not add constraints on cognitive capacity, or self-control, or susceptibility to the formation of bad habits? The hope is that by incorporating insights from other behavioural sciences (most notably cognitive psychology and neuroscience) while retaining the methodological rigour of neoclassical economics, behavioural economics will yield a more richly descriptive theory of human behaviour, and generate new and important insights to better inform public policy.

Policy makers have taken notice. In an era in which free-market rhetoric dominates the political landscape, the idea that small changes to public health policies might serve to nudge consumers towards healthier behaviours holds great appeal. Even though some (irrational) consumers might be better off, the argument goes, if certain unhealthy food products were banned (or worse, taxed), this approach would infringe on the rights of the many consumers who want to indulge occasionally, and fully understand the consequences. If governments could instead use evidence from consumer science to make food labels more effective, or to improve the way that healthy foods are presented in school cafeterias, more politically unpalatable interventions in the marketplace might not be needed. This idea, dubbed “libertarian paternalism” by Richard Thaler and Cass Sunstein, has been pursued with gusto in both the UK (David Cameron’s Government formed the Behavioural Insights Team—unofficially described as the Nudge Unit) and the USA (where Sunstein spent time in the Obama administration’s Office of Information and Regulatory Affairs).

Whatever public health practitioners might think about these developments—or indeed, of economics as a discipline—this turn of events has rather suddenly given scholars at the cutting edge of consumer science an influential voice in the regulatory process, and some of the best and brightest have stepped up to contribute. Behavioral Economics & Public Health (edited by Christina Roberto and Ichiro Kawachi) is the product of a 2014 Harvard University exploratory workshop on applying social science insights to public health. As might be expected in a volume that aims to bring together two such inherently multidisciplinary fields, the book’s 11 chapters offer an eclectic mix of perspectives. The editors begin with an excellent overview of the field of behavioural economics and its applications to public health, and an economic perspective can also be found in four of the other chapters: Justin White and William Dow write about intertemporal choice, Kristina Lewis and Jason Block review the use of incentives to promote health, Michael Sanders and Michael Hallsworth describe their experience working within the UK’s Behavioural Insights Team, and Frederick Zimmerman concludes with a thoughtful critique of the field of behavioural economics. The other contributions are largely from the perspectives of psychology and marketing: Dennis Runger and Wendy Wood discuss habit formation, Rebecca Ferrer and colleagues emphasise the importance of emotion in decision making, Brent McFerran discusses social norms in the context of obesity, Jason Riis and Rebecca Ratner explain why some public health communication strategies are more effective than others, and Zoe Chance and colleagues and Brian Wansink offer frameworks for designing environments (eg, in schools and workplaces) that are conducive to healthy choices.

This collection of essays holds many hidden gems, but the one that surprised me the most was the attention given (by Runger and Wood briefly, and Zimmerman extensively) to a dirty little secret that behavioural economists rarely mention: once it is acknowledged that sometimes-irrational consumers can be manipulated into making healthy choices, it does not require much of a leap to conclude that business interests can—and do—use the same methods to push back in the other direction. This conclusion leads Zimmerman to a discussion of power in the marketplace and in our collective political economy, and to a call to action on these larger structural issues in society that neoclassical theory has long neglected….(More; Book)

‘Big data’ was supposed to fix education. It didn’t. It’s time for ‘small data’


Pasi Sahlberg and Jonathan Hasak in the Washington Post: “One thing that distinguishes schools in the United States from schools around the world is how data walls, which typically reflect standardized test results, decorate hallways and teacher lounges. Green, yellow, and red colors indicate levels of performance of students and classrooms. For serious reformers, this is the type of transparency that reveals more data about schools and is seen as part of the solution for conducting effective school improvement. These data sets, however, often don’t spark insight about teaching and learning in classrooms; they are based on analytics and statistics, not on the emotions and relationships that drive learning in schools. They also report outputs and outcomes, not the impacts of learning on the lives and minds of learners….

If you are a leader of any modern education system, you probably care a lot about collecting, analyzing, storing, and communicating massive amounts of information about your schools, teachers, and students based on these data sets. This information is “big data,” a term that first appeared around 2000 and refers to data sets so large and complex that processing them with conventional data processing applications isn’t possible. Two decades ago, the types of data that education management systems processed were input factors of the education system, such as student enrollments, teacher characteristics, or education expenditures, handled by an education department’s statistical officer. Today, however, big data covers a range of indicators about teaching and learning processes, and increasingly reports on student achievement trends over time.

With the outpouring of data, international organizations continue to build regional and global data banks. Whether it’s the United Nations, the World Bank, the European Commission, or the Organization for Economic Cooperation and Development, today’s international reformers are collecting and handling more data about human development than before. Beyond government agencies, there are global education and consulting enterprises like Pearson and McKinsey that see business opportunities in big data markets.

Among the best known today is the OECD’s Program for International Student Assessment (PISA), which measures reading, mathematical, and scientific literacy of 15-year-olds around the world. OECD now also administers an Education GPS, or a global positioning system, that aims to tell policymakers where their education systems place in a global grid and how to move to desired destinations. OECD has clearly become a world leader in the big data movement in education.

Despite all this new information and the benefits that come with it, there are clear handicaps in how big data has been used in education reforms. In fact, pundits and policymakers often forget that big data, at best, only reveals correlations between variables in education, not causality. As any introduction to statistics course will tell you, correlation does not imply causation….
We believe that it is becoming evident that big data alone won’t be able to fix education systems. Decision-makers need to gain a better understanding of what good teaching is and how it leads to better learning in schools. This is where information about details, relationships and narratives in schools become important. These are what Martin Lindstrom calls “small data”: small clues that uncover huge trends. In education, these small clues are often hidden in the invisible fabric of schools. Understanding this fabric must become a priority for improving education.

To be sure, there is not one right way to gather small data in education. Perhaps the most important next step is to realize the limitations of current big data-driven policies and practices. Too strong a reliance on externally collected data may be misleading in policy-making. Here is what small data could look like in practice:

  • It reduces census-based national student assessments to the necessary minimum and transfers the saved resources to enhance the quality of formative assessments in schools and of teacher education in alternative assessment methods. Evidence shows that formative and other school-based assessments are much more likely to improve the quality of education than conventional standardized tests.
  • It strengthens the collective autonomy of schools by giving teachers more independence from bureaucracy and by investing in teamwork in schools. This would enhance social capital, which has proved to be a critical aspect of building trust within education and enhancing student learning.
  • It empowers students by involving them in assessing and reflecting on their own learning and then incorporating that information into collective human judgment about teaching and learning (supported by national big data). Because there are different ways students can be smart in schools, no single way of measuring student achievement will reveal success. Students’ voices about their own growth may be those tiny clues that can uncover important trends for improving learning.

W. Edwards Deming once said that “without data you are another person with an opinion.” But Deming couldn’t have imagined the size and speed of data systems we have today….(More)”

Can Crowdsourcing Help Make Life Easier For People With Disabilities?


Sean Captain at FastCompany: “These days GPS technology can get you as close as about 10 feet from your destination, close enough to see it—assuming you can see.

But those last few feet are a chasm for the blind (and GPS accuracy sometimes falls only within about 30 feet).

“Actually finding the bus stop, not the right street, but standing in the right place when the bus comes, is pretty hard,” says Dave Power, president and CEO of the Perkins School for the Blind near Boston. Helen Keller’s alma mater is developing a mobile app that will provide audio directions—contributed by volunteers—so that blind people can get close enough to the stop for the bus driver to notice them.

Perkins’s app is one of 29 projects that recently received a total of $20 million in funding from Google.org’s Google Impact Challenge: Disabilities awards. Several of the winning initiatives rely on crowdsourced information to help the disabled—be they blind, in a wheelchair, or cognitively impaired. It’s a commonsense approach to tackling big logistical projects in a world full of people who have snippets of downtime during which they might perform bite-size acts of kindness online. But moving these projects from being just clever concepts to extensive services, based on the goodwill of volunteers, is going to be quite a hurdle.

People with limited mobility may have trouble traversing the last few feet between them and a wheelchair ramp, automatic doors, or other accommodations that aren’t easy to find (or may not even exist in some places). Wheelmap, based in Berlin, is trying to help by building online maps of accessible locations. Its website incorporates crowdsourced data. The site lets users type in a city and search for accessible amenities such as restaurants, hotels, and public transit.

Paris-based J’accede (which received 500,000 euros from Google, equivalent to about $565,000) provides similar capabilities in both a website and an app, with a slicker design somewhat resembling TripAdvisor.

Both services have a long way to go. J’accede lists 374 accessible bars/restaurants in its hometown and a modest selection in other French cities like Marseille. “We still have a lot of work to do to cover France,” says J’accede’s president Damien Birambeau in an email. The goal is to go global, though, and the site is available in English, German, and Spanish, in addition to French. Likewise, Wheelmap (which got 825,000 euros, or $933,000) performs best in the German capital of Berlin and cities like Hamburg, but is less useful in other places.

These sites face the same challenge as many other volunteer-based, crowdsourced projects: getting a big enough crowd to contribute information to the service. J’accede hopes to make the process easier. In June, it will connect with Google Places, so contributors will only need to supply details about accommodations at a site; information like the location’s address and phone number will be pulled in automatically. But both J’accede and Wheelmap recognize that crowdsourcing has its limits. They are now going beyond voluntary contributions, setting up automated systems to scrape information from other databases of accessible locations, such as those maintained by governments.
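To make that division of labor concrete, here is a minimal sketch of the kind of merge described above: a contributor supplies only accessibility details, while basic place metadata is pulled in automatically from an external places database. The field names and the fetch_place_metadata helper are hypothetical stand-ins, not J’accede’s or Google Places’ actual schema or API.

```python
# Illustrative sketch: field names and fetch_place_metadata are hypothetical
# placeholders, not J'accede's or Google Places' real schema or API.

def fetch_place_metadata(place_id):
    """Stand-in for a lookup against an external places database,
    returning basic metadata keyed by a place identifier."""
    return {
        "place_id": place_id,
        "name": "Example Cafe",
        "address": "1 Example Street, Paris",
        "phone": "+33 1 00 00 00 00",
    }

def merge_accessibility_report(place_id, contributor_report):
    """Attach a contributor's accessibility details to automatically
    fetched metadata, so volunteers only describe accessibility."""
    record = fetch_place_metadata(place_id)
    record["accessibility"] = {
        "step_free_entrance": contributor_report.get("step_free_entrance"),
        "accessible_restroom": contributor_report.get("accessible_restroom"),
        "notes": contributor_report.get("notes", ""),
    }
    return record

if __name__ == "__main__":
    report = {"step_free_entrance": True, "accessible_restroom": False,
              "notes": "Ramp at the side entrance."}
    print(merge_accessibility_report("example-place-123", report))
```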

Wheelmap and J’accede are dwarfed by general-interest crowdsourced sites like TripAdvisor and Yelp, which offer some information about accessibility, too. For instance, among the many filters they offer users searching for restaurants—such as price range and cuisine type—TripAdvisor and Yelp both offer a Wheelchair Accessible checkbox. Applying that filter to Parisian establishments brings up about 1,000 restaurants on TripAdvisor and 2,800 in Yelp.

So what can Wheelmap and J’accede provide that the big players can’t? Details. “A person in a wheelchair, for example, will face different obstacles than a partially blind person or a person with cognitive disabilities,” says Birambeau. “These different needs and profiles means that we need highly detailed information about the accessibility of public places.”…(More)”

Hail the maintainers


Andrew Russell & Lee Vinsel at AEON: “The trajectory of ‘innovation’ from core, valued practice to slogan of dystopian societies is not entirely surprising, at a certain level. There is a formulaic feel: a term gains popularity because it resonates with the zeitgeist, reaches buzzword status, then suffers from overexposure and cooptation. Right now, the formula has brought society to a question: after ‘innovation’ has been exposed as hucksterism, is there a better way to characterise relationships between society and technology?

There are three basic ways to answer that question. First, it is crucial to understand that technology is not innovation. Innovation is only a small piece of what happens with technology. This preoccupation with novelty is unfortunate because it fails to account for technologies in widespread use, and it obscures how many of the things around us are quite old. In his book The Shock of the Old (2007), the historian David Edgerton examines technology-in-use. He finds that common objects, like the electric fan and many parts of the automobile, have been virtually unchanged for a century or more. When we take this broader perspective, we can tell different stories with drastically different geographical, chronological, and sociological emphases. The stalest innovation stories focus on well-to-do white guys sitting in garages in a small region of California, but human beings in the Global South live with technologies too. Which ones? Where do they come from? How are they produced, used, repaired? Yes, novel objects preoccupy the privileged, and can generate huge profits. But the most remarkable tales of cunning, effort, and care that people direct toward technologies exist far beyond the same old anecdotes about invention and innovation.

Second, by dropping innovation, we can recognise the essential role of basic infrastructures. ‘Infrastructure’ is a most unglamorous term, the type of word that would have vanished from our lexicon long ago if it didn’t point to something of immense social importance. Remarkably, in 2015 ‘infrastructure’ came to the fore of conversations in many walks of American life. In the wake of a fatal Amtrak crash near Philadelphia, President Obama wrestled with Congress to pass an infrastructure bill that Republicans had been blocking, but finally approved in December 2015. ‘Infrastructure’ also became the focus of scholarly communities in history and anthropology, even appearing 78 times on the programme of the annual meeting of the American Anthropological Association. Artists, journalists, and even comedians joined the fray, most memorably with John Oliver’s hilarious sketch starring Edward Norton and Steve Buscemi in a trailer for an imaginary blockbuster on the dullest of subjects. By early 2016, the New York Review of Books brought the ‘earnest and passive word’ to the attention of its readers, with a depressing essay titled ‘A Country Breaking Down’.

Despite recurring fantasies about the end of work, the central fact of our industrial civilisation is labour, most of which falls far outside the realm of innovation

The best of these conversations about infrastructure move away from narrow technical matters to engage deeper moral implications. Infrastructure failures – train crashes, bridge failures, urban flooding, and so on – are manifestations of and allegories for America’s dysfunctional political system, its frayed social safety net, and its enduring fascination with flashy, shiny, trivial things. But, especially in some corners of the academic world, a focus on the material structures of everyday life can take a bizarre turn, as exemplified in work that grants ‘agency’ to material things or wraps commodity fetishism in the language of high cultural theory, slick marketing, and design. For example, Bloomsbury’s ‘Object Lessons’ series features biographies of and philosophical reflections on human-built things, like the golf ball. What a shame it would be if American society matured to the point where the shallowness of the innovation concept became clear, but the most prominent response was an equally superficial fascination with golf balls, refrigerators, and remote controls.

Third, focusing on infrastructure or on old, existing things rather than novel ones reminds us of the absolute centrality of the work that goes into keeping the entire world going…..

 

We organised a conference to bring the work of the maintainers into clearer focus. More than 40 scholars answered a call for papers asking, ‘What is at stake if we move scholarship away from innovation and toward maintenance?’ Historians, social scientists, economists, business scholars, artists, and activists responded. They all want to talk about technology outside of innovation’s shadow.

One important topic of conversation is the danger of moving too triumphantly from innovation to maintenance. There is no point in keeping the practice of hero-worship that merely changes the cast of heroes without confronting some of the deeper problems underlying the innovation obsession. One of the most significant problems is the male-dominated culture of technology, manifest in recent embarrassments such as the flagrant misogyny in the ‘#GamerGate’ row a couple of years ago, as well as the persistent pay gap between men and women doing the same work.

There is an urgent need to reckon more squarely and honestly with our machines and ourselves. Ultimately, emphasising maintenance involves moving from buzzwords to values, and from means to ends. In formal economic terms, ‘innovation’ involves the diffusion of new things and practices. The term is completely agnostic about whether these things and practices are good. Crack cocaine, for example, was a highly innovative product in the 1980s, which involved a great deal of entrepreneurship (called ‘dealing’) and generated lots of revenue. Innovation! Entrepreneurship! Perhaps this point is cynical, but it draws our attention to a perverse reality: contemporary discourse treats innovation as a positive value in itself, when it is not.

Entire societies have come to talk about innovation as if it were an inherently desirable value, like love, fraternity, courage, beauty, dignity, or responsibility. Innovation-speak worships at the altar of change, but it rarely asks who benefits, to what end? A focus on maintenance provides opportunities to ask questions about what we really want out of technologies. What do we really care about? What kind of society do we want to live in? Will this help get us there? We must shift from means, including the technologies that underpin our everyday actions, to ends, including the many kinds of social beneficence and improvement that technology can offer. Our increasingly unequal and fearful world would be grateful….(More)”

Citizen scientists aid Ecuador earthquake relief


Mark Zastrow at Nature: “After a magnitude-7.8 earthquake struck Ecuador’s Pacific coast on 16 April, a new ally joined the international relief effort: a citizen-science network called Zooniverse.

On 25 April, Zooniverse launched a website that asks volunteers to analyse rapidly snapped satellite imagery of the disaster, which led to more than 650 reported deaths and 16,000 injuries. The aim is to help relief workers on the ground to find the most heavily damaged regions and identify which roads are passable.

Several crisis-mapping programmes with thousands of volunteers already exist — but it can take days to train satellites on the damaged region and to transmit data to humanitarian organizations, and results have not always proven useful. The Ecuador quake marked the first live public test of an effort dubbed the Planetary Response Network (PRN), which promises both to be more nimble than previous efforts and to use more rigorous machine-learning algorithms to evaluate the quality of crowd-sourced analyses.

The network relies on imagery from the satellite company Planet Labs in San Francisco, California, which uses an array of shoebox-sized satellites to map the planet. In order to speed up the crowd-sourced process, it uses the Zooniverse platform to distribute the tasks of spotting features in satellite images. Machine-learning algorithms employed by a team at the University of Oxford, UK, then classify the reliability of each volunteer’s analysis and weight their contributions accordingly.
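The article does not spell out the algorithm, but the basic idea of weighting each volunteer’s input by an estimate of their reliability can be sketched in a few lines. In this illustration the weights are simply assumed per-volunteer reliability scores; this is a simplification for clarity, not a description of the Oxford team’s actual method.

```python
from collections import defaultdict

# Minimal sketch of reliability-weighted aggregation of volunteer labels.
# The weighting scheme (a per-volunteer reliability score) is assumed for
# illustration; it is not the Oxford team's published algorithm.

def aggregate_labels(classifications, reliability):
    """classifications: iterable of (volunteer_id, image_id, label) tuples.
    reliability: dict mapping volunteer_id to a weight in (0, 1].
    Returns the weighted-majority label for each image."""
    scores = defaultdict(lambda: defaultdict(float))
    for volunteer, image, label in classifications:
        scores[image][label] += reliability.get(volunteer, 0.5)
    return {image: max(votes, key=votes.get) for image, votes in scores.items()}

if __name__ == "__main__":
    classifications = [
        ("v1", "img-001", "damaged"),
        ("v2", "img-001", "undamaged"),
        ("v3", "img-001", "damaged"),
    ]
    reliability = {"v1": 0.9, "v2": 0.4, "v3": 0.7}
    print(aggregate_labels(classifications, reliability))  # {'img-001': 'damaged'}
```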

Rapid-fire data

Within 2 hours of the Ecuador test project going live with a first set of 1,300 images, each photo had been checked at least 20 times. “It was one of the fastest responses I’ve seen,” says Brooke Simmons, an astronomer at the University of California, San Diego, who leads the image processing. Steven Reece, who heads the Oxford team’s machine-learning effort, says that results — a “heat map” of damage with possible road blockages — were ready in another two hours.

In all, more than 2,800 Zooniverse users contributed to analysing roughly 25,000 square kilometres of imagery centred around the coastal cities of Pedernales and Bahia de Caraquez. That is where the London-based relief organization Rescue Global — which requested the analysis the day after the earthquake — currently has relief teams on the ground, including search dogs and medical units….(More)”

From Stalemate to Solutions


Karen Abrams Gerber & Andrea Jacobs at Stanford Social Innovation Review: “….We waste time asking, “How can we change the way people think?” when we should be asking, “How do we change the way we do things?”

Changing how we do things isn’t just about reworking laws, policies, and systems; it means rethinking the very act of problem-solving. We believe there are five basic tenets to successful collaboration:

  1. Engaging unlikely bedfellows
  2. Creating a resonant vision
  3. Cultivating relationships
  4. Communicating across worldviews
  5. Committing to ongoing learning

Over the past two years, we’ve researched an organization that embodies all of these: Convergence Center for Policy Resolution, which “convenes people and groups with conflicting views to build trust, identify solutions, and form alliances for action on critical national issues.” Its projects include reimagining K-12 education, addressing economic mobility and poverty, reforming the federal budget process, financing long-term care, and improving the dietary choices and wellness of Americans.

The organization’s unique approach to collaboration enables adversaries to work together and develop breakthrough solutions. It starts with targeting and framing an issue, and then enrolling a wide spectrum of stakeholders. Over an extended period of time, these stakeholders attend a series of expertly facilitated meetings to explore the issue and identify solutions, and finally take joint action….

Foundational to Convergence’s success is the principle of engaging unlikely bedfellows. Stakeholder diversity helps eliminate the “echo chamber” effect (also described by Witter and Mikulsky) created when like-minded groups talk only with one another. The organization vets potential stakeholders to determine their capacity for working with the tensions and complexities of diverse perspectives and their willingness to participate in an emergent process, believing that each ideological camp holds a crucial piece of the puzzle and that the tension of differing views actually creates better solutions.

Convergence exemplifies the power of creating a resonant vision in its approach to tackling big social issues. Framing the issue in a way that galvanizes all stakeholders takes tremendous time, energy, and skill. For example, when the organization decided to focus on addressing K-12 education in the United States, it engaged in hundreds of interviews to identify the best way to frame the project. While everyone agreed the system did not serve the needs of many students, they had difficulty finding consensus about how to move forward. One stakeholder commented that the current system was based on a 19th-century factory model that could never meet the needs of 21st-century students. This comment sparked a new narrative that excited stakeholders across the ideological spectrum: “reimagining education for the 21st century!”

It’s important to note that Convergence focuses on framing the problem, not formulating the solution(s). Rather, it believes the solution emerges through the process of authentic collaboration. This differs significantly from an advocacy-based approach, in which a group agrees on a solution and then mobilizes as much support for that solution as possible. As a result, solutions created through Convergence’s collaborative approach are better able to weather the resistance that all change efforts face, because some of that resistance is built into the process.

Change takes time, and so does cultivating relationships. In an article last year, Jane Wei-Skillern, David Ehrlichman, and David Sawyer wrote, “The single most important factor behind all successful collaborations is trust-based relationships among participants.”…..

Change is complex and certainly not linear. Convergence’s approach “lives” this complexity and uncertainty. In its own words, the organization is “building the ship while sailing it.” Its success is due in part to actively and simultaneously engaging each of the five tenets of authentic collaboration, and its work demonstrates the powerful possibilities of authentic collaboration at a time when partisan rancor and stalemate feel inevitable. It proves we can change the world—collaboratively—without anyone relinquishing their core values….(More)”

Open Data Supply: Enriching the usability of information


Report by Phoensight: “With the emergence of increasing computational power, high cloud storage capacity and big data comes an eager anticipation of one of the biggest IT transformations of our society today.

Open data has an instrumental role to play in our digital revolution by creating unprecedented opportunities for governments and businesses to leverage previously unavailable information to strengthen their analytics and decision making for new client experiences. Whilst virtually every business recognises the value of data and the importance of the analytics built on it, the ability to realise the potential for maximising revenue and cost savings is not straightforward. The discovery of valuable insights often involves the acquisition of new data and an understanding of it. As we move towards an increasing supply of open data, technological and other entrepreneurs will look to better utilise government information for improved productivity.

This report uses a data-centric approach to examine the usability of information by considering ways in which open data could better facilitate data-driven innovations and further boost our economy. It assesses the state of open data today and suggests ways in which data providers could supply open data to optimise its use. A number of useful measures of information usability, such as accessibility, quantity, quality and openness, are presented; together these contribute to the Open Data Usability Index (ODUI). For the first time, a comprehensive assessment of open data usability has been developed, and it is expected to be a critical step in taking the open data agenda to the next level.
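The report does not publish the ODUI formula, but a composite of this kind is commonly built by normalising each measure to a common scale and combining them with weights. The sketch below assumes equal weights and scores already scaled to the 0-1 range; both are illustrative assumptions, not the ODUI’s actual methodology.

```python
# Hypothetical sketch of a composite usability score in the spirit of the ODUI.
# Equal weights and 0-1 scaling are assumptions for illustration only.

MEASURES = ("accessibility", "quantity", "quality", "openness")

def usability_index(scores, weights=None):
    """scores: dict mapping each measure to a value normalised to [0, 1].
    Returns a weighted average in [0, 1]."""
    if weights is None:
        weights = {m: 1.0 / len(MEASURES) for m in MEASURES}
    return sum(scores[m] * weights[m] for m in MEASURES)

if __name__ == "__main__":
    dataset_scores = {"accessibility": 0.8, "quantity": 0.6,
                      "quality": 0.7, "openness": 0.9}
    print(round(usability_index(dataset_scores), 3))  # 0.75
```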

With over two million government datasets assessed against the open data usability framework and models developed to link entire countries’ datasets to key industry sectors, never before has such an extensive analysis been undertaken. Government open data across Australia, Canada, Singapore, the United Kingdom and the United States reveals that most countries have the capacity for improvements in their information usability. It was found that for 2015 the United Kingdom led the way, followed by Canada, Singapore, the United States and Australia. The global potential of government open data is expected to reach 20 exabytes by 2020, provided governments are able to release as much data as possible within legislative constraints….(More)”

7 projects that state and local governments can reuse


Melody Kramer at 18F: “We’re starting to see state and local governments adapt or use 18F products or tools. Nothing could make us happier; all of our code (and content) is available for anyone to use and reuse.

There are a number of open source projects that 18F has worked on that could work particularly well at any level of government. We’re highlighting seven below:

Public website analytics

A screen shot of the City of Boulder's analytics dashboard

We worked with the Digital Analytics Program, the U.S. Digital Service (USDS), and the White House to build and host a dashboard showing real-time U.S. federal government web traffic. This helps staff and the public learn about how people use government websites. The dashboard itself is open source and can be adapted for a state or local government. We recently interviewed folks from Philadelphia, Boulder, and the state of Tennessee about how they’ve adapted the analytics dashboard for their own use.

Quick mini-sites for content

A screen shot of an 18F guide on the pages platform

We built a responsive, accessible website template (based on open source work by the Consumer Financial Protection Bureau) that we use primarily for documentation and guides. You can take the website template, adapt the colors and fonts to reflect your own style template, and have an easy way to release notes about a project. We’ve used this template to write a guide on accessibility in government, content guidelines, and a checklist for what needs to take place before we release software. You’re also welcome to take our content and adapt it for your own needs — what we write is in the public domain.

Insight into how people interact with government

People depend on others (for example, family members, friends, and public library staff) for help with government websites, but government services are not set up to support this type of assistance.

Over the last several months, staff from the General Services Administration’s USAGov and 18F teams have been talking to Americans around the country about their interactions with the federal government. The goal of the research was to identify and create cross-agency services and resources to improve how the government interacts with the public. Earlier this month, we published all of our research. You can read the full report with findings or explore what we learned on the 18F blog.

Market research for procurement

We developed a tool that helps you easily conduct market research across a number of categories for acquiring professional labor. You can read about how the city of Boston is using the tool to conduct market research.

Vocabulary for user-centered design

We released a deck of method cards that help research and design teams communicate a shared vocabulary across teams and agencies.

Task management

We recently developed a checklist program that helps users manage complex to-do lists. One feature: checklist item deadlines can be set according to a fixed date or relative to the completion of other items. This means you can create a checklist for all new employees, for example, and say “task five should be completed four days after task four”; the due date is then calculated whenever an employee completes task four.
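As a rough illustration of how fixed and relative deadlines can coexist in one data model, here is a small sketch. The field names and structure are assumptions for the example, not 18F’s actual implementation (which is open source and worth consulting directly).

```python
from datetime import date, timedelta

# Illustrative sketch of fixed vs. relative checklist deadlines.
# The data model below is an assumption, not 18F's actual code.

def resolve_due_date(item, completion_dates):
    """item: {'id': ..., 'due': date} for a fixed deadline, or
    {'id': ..., 'after': other_id, 'offset_days': int} for a relative one.
    completion_dates: dict mapping item ids to their completion dates."""
    if "due" in item:
        return item["due"]
    completed_on = completion_dates.get(item["after"])
    if completed_on is None:
        return None  # prerequisite not finished yet, so no deadline to show
    return completed_on + timedelta(days=item["offset_days"])

if __name__ == "__main__":
    task5 = {"id": "task-5", "after": "task-4", "offset_days": 4}
    completions = {"task-4": date(2016, 5, 2)}
    print(resolve_due_date(task5, completions))  # 2016-05-06
```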

Help small businesses find opportunities

FBOpen is a set of open source tools to help small businesses search for opportunities to work with the U.S. government. FBOpen presents an Application Programming Interface (API) to published Federal contracting opportunities, as well as a beautiful graphical user interface to the same opportunities.
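Calling a search API of this kind typically means sending a keyword query over HTTP and reading back structured results. The sketch below is purely illustrative: the base URL, query parameters, and response fields are placeholders, not FBOpen’s actual interface, which is documented in the project’s own repository.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical sketch only: BASE_URL, the query parameters, and the response
# fields are placeholders, not FBOpen's real endpoint or schema.
BASE_URL = "https://example.gov/fbopen/opportunities"

def search_opportunities(keywords, limit=10):
    """Search published contracting opportunities by keyword (illustrative)."""
    query = urllib.parse.urlencode({"q": keywords, "limit": limit})
    with urllib.request.urlopen(f"{BASE_URL}?{query}") as response:
        return json.load(response)

if __name__ == "__main__":
    results = search_opportunities("janitorial services")
    for opportunity in results.get("opportunities", []):
        print(opportunity.get("title"), "-", opportunity.get("agency"))
```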

Anyone who wishes to may reuse this code to create their own website, free of charge and unencumbered by obligations….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptable detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”