Mission Control: A History of the Urban Dashboard


Futuristic control rooms have proliferated in dozens of global cities. Baltimore has its CitiStat Room, where department heads stand at a podium before a wall of screens and account for their units’ performance.  The Mayor’s office in London’s City Hall features a 4×3 array of iPads mounted in a wooden panel, which seems an almost parodic, Terry Gilliam-esque take on the Brazilian Ops Center. Meanwhile, British Prime Minister David Cameron commissioned an iPad app – the “No. 10 Dashboard” (a reference to his residence at 10 Downing Street) – which gives him access to financial, housing, employment, and public opinion data. As The Guardian reported, “the prime minister said that he could run government remotely from his smartphone.”

This is the age of Dashboard Governance, heralded by gurus like Stephen Few, founder of the “visual business intelligence” and “sensemaking” consultancy Perceptual Edge, who defines the dashboard as a “visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.” A well-designed dashboard, he says — one that makes proper use of bullet graphs, sparklines, and other visualization techniques informed by the “brain science” of aesthetics and cognition — can afford its users not only a perceptual edge, but a performance edge, too. The ideal display offers a big-picture view of what is happening in real time, along with information on historical trends, so that users can divine the how and why and redirect future action. As David Nettleton emphasizes, the dashboard’s utility extends beyond monitoring “the current situation”; it also “allows a manager to … make provisions, and take appropriate actions.”….

The dashboard market now extends far beyond the corporate world. In 1994, New York City police commissioner William Bratton adapted former officer Jack Maple’s analog crime maps to create the CompStat model of aggregating and mapping crime statistics. Around the same time, the administrators of Charlotte, North Carolina, borrowed a business idea — Robert Kaplan’s and David Norton’s “total quality management” strategy known as the “Balanced Scorecard” — and began tracking performance in five “focus areas” defined by the City Council: housing and neighborhood development, community safety, transportation, economic development, and the environment. Atlanta followed Charlotte’s example in creating its own city dashboard.

In 1999, Baltimore mayor Martin O’Malley, confronting a crippling crime rate and high taxes, designed CitiStat, “an internal process of using metrics to create accountability within his government.” (This rhetoric of data-tested internal “accountability” is prevalent in early dashboard development efforts.) The project turned to face the public in 2003, when Baltimore launched a website of city operational statistics, which inspired DCStat (2005), Maryland’s StateStat (2007), and NYCStat (2008). Since then, myriad other states and metro areas — driven by a “new managerialist” approach to urban governance, committed to “benchmarking” their performance against other regions, and obligated to demonstrate compliance with sustainability agendas — have developed their own dashboards.

The Open Michigan Mi Dashboard is typical of these efforts. The state website presents data on education, health and wellness, infrastructure, “talent” (employment, innovation), public safety, energy and environment, financial health, and seniors. You (or “Mi”) can monitor the state’s performance through a side-by-side comparison of “prior” and “current” data, punctuated with a thumbs-up or thumbs-down icon indicating the state’s “progress” on each metric. Another click reveals a graph of annual trends and a citation for the data source, but little detail about how the data are actually derived. How the public is supposed to use this information is an open question….(More)”
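The mechanics behind such a display are, at bottom, simple; that simplicity is part of the critique. A minimal sketch in Python (hypothetical code and figures, not Michigan's actual implementation):

```python
# Hypothetical sketch of a dashboard "progress" indicator, in the spirit of
# Mi Dashboard's thumbs-up/thumbs-down icons. Not the state's actual code;
# the metric values below are invented.

def progress_icon(prior, current, higher_is_better=True):
    """Return a thumbs-up/down/no-change marker for a prior-vs-current metric."""
    if current == prior:
        return "no change"
    improved = (current > prior) if higher_is_better else (current < prior)
    return "thumbs up" if improved else "thumbs down"

metrics = [
    # (name, prior, current, higher_is_better)
    ("High school graduation rate (%)", 76.0, 78.5, True),
    ("Unemployment rate (%)", 8.4, 7.9, False),
]

for name, prior, current, better in metrics:
    print(f"{name}: {prior} -> {current}  [{progress_icon(prior, current, better)}]")
```

Everything about methodology and provenance is compressed out of the icon, which is precisely the detail the dashboard withholds.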

Why governments need guinea pigs for policies


Jonathan Breckon in the Guardian: “People are unlikely to react positively to the idea of using citizens as guinea pigs; many will be downright disgusted. But there are times when government must experiment on us in the search for knowledge and better policy….

Though history calls into question the ethics of experimentation, unless we try things out, we will never learn. The National Audit Office says that £66bn worth of government projects have no plans to evaluate their impact. It is unethical to roll out policies in this arbitrary way. We have to experiment on a small scale to have a better understanding of how things work before rolling out policies across the UK. This is just as relevant to social policy as it is to science and medicine, as set out in a new report by the Alliance for Useful Evidence.

Whether it’s the best ways to teach our kids to read, designing programmes to get unemployed people back to work, or encouraging organ donation – if the old ways don’t work, we have to test new ones. And that testing can’t always be done by a committee in Whitehall or in a university lab.

Experimentation can’t happen in isolation. What works in Lewisham or Londonderry might not work in Lincoln – or indeed across the UK. For instance, there is a huge amount of debate around the current practice of teaching children to read and spell using phonics, which was based on a small-scale study in Clackmannanshire, as well as evidence from the US. A government-commissioned review of the evidence for phonics led professor Carole Torgerson, then at York University, to warn against making national policy off the back of just one small Scottish trial.

One way round this problem is to do larger experiments. The increasing use of the internet in public services allows for more and faster experimentation, on a larger scale for lower cost – the randomised controlled trial on voter mobilisation that went to 61 million users in the 2010 US midterm elections, for example. However, the use of the internet doesn’t get us off the ethical hook. Facebook had to apologise after a global backlash to secret psychological tests on 689,000 of its users.

Contentious experiments should be approved by ethics committees – normal practice for trials in hospitals and universities.

We are also not interested in freewheeling trial-and-error; robust and appropriate research techniques to learn from experiments are vital. It’s best to see experimentation as a continuum, ranging from the messiness of attempts to try something new to experiments using the best available social science, such as randomised controlled trials.
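At the rigorous end of that continuum sits the randomised controlled trial, whose core logic is easy to sketch. The toy example below is illustrative only; the outcome measure and effect size are invented for demonstration:

```python
# Toy sketch of a two-arm randomised controlled trial: randomly assign
# participants, then compare mean outcomes. Illustrative only.
import random
from statistics import mean

random.seed(42)

participants = list(range(1000))
random.shuffle(participants)
treatment = set(participants[:500])   # random assignment guards against selection bias

def outcome(treated):
    """Stand-in for a measured outcome, e.g. back in work within six months."""
    base = random.random()
    return base + (0.05 if treated else 0.0)   # assume a small true effect

treated_results = [outcome(True) for p in participants if p in treatment]
control_results = [outcome(False) for p in participants if p not in treatment]

effect = mean(treated_results) - mean(control_results)
print(f"Estimated effect of the policy: {effect:+.3f}")   # should hover near +0.05
```

Random assignment is what makes the comparison fair: because the two groups differ only by chance, the gap in average outcomes can be attributed to the policy itself.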

Experimental government means avoiding an approach where everything is fixed from the outset. What we need is “a spirit of experimentation, unburdened by promises of success”, as recommended by the late professor Roger Jowell, author of the 2003 Cabinet Office report, Trying it out [pdf]….(More)”

Design in policy making


At the Open Policy Making Blog: “….In recent years, notable policy and business experts have been discussing the value of design and ‘design thinking’ as an approach to improving the way Government delivers services in one form or another for (and with) citizens. Examples include Roger Martin from the Rotman School of Management, Christian Bason, formerly of MindLab, Marco Steinberg of Sitra, Hilary Cottam of Participle, and many more who have been promoting the use of design as a tool for service transformation.

So what is design and how is it being applied in government?  This is the question that has been posed this week at the Service Design in Government conference in London.  This week is also the launch of some of the Policy Lab tools in the Policy Toolkit.

The Policy Lab have produced a short introduction to design, service design and design thinking. It serves to explain how we are defining and using the term design in various ways in a policy context, as well as to provide practical tools and examples of design being used in policy making.

We tend to spot design when it goes wrong: badly laid out forms, websites we can’t navigate, confusing signage, transport links that don’t join together, queues for services that are in demand. Bad design is a time thief.  We can also spot good design when we see it, but how is it achieved?…(More)”

Big Data for Social Good


Introduction to a Special Issue of the Journal “Big Data” by Charlie Catlett and Rayid Ghani: “…organizations focused on social good are realizing the potential as well but face several challenges as they seek to become more data-driven. The biggest challenge they face is a paucity of examples and case studies on how data can be used for social good. This special issue of Big Data is targeted at tackling that challenge and focuses on highlighting some exciting and impactful examples of work that uses data for social good. The special issue is just one example of the recent surge in such efforts by the data science community. …

This special issue solicited case studies and problem statements that would either highlight (1) the use of data to solve a social problem or (2) social challenges that need data-driven solutions. From roughly 20 submissions, we selected 5 articles that exemplify this type of work. These cover five broad application areas: international development, healthcare, democracy and government, human rights, and crime prevention.

“Understanding Democracy and Development Traps Using a Data-Driven Approach” (Ranganathan et al.) details a data-driven model linking democracy, cultural values, and socioeconomic indicators to identify two types of “traps” that hinder the development of democracy. They use historical data to detect causal factors and make predictions about the time expected for a given country to overcome these traps.

“Targeting Villages for Rural Development Using Satellite Image Analysis” (Varshney et al.) discusses two case studies that use data and machine learning techniques for international economic development—solar-powered microgrids in rural India and targeting financial aid to villages in sub-Saharan Africa. In the process, the authors stress the importance of understanding the characteristics and provenance of the data and the criticality of incorporating local “on the ground” expertise.

In “Human Rights Event Detection from Heterogeneous Social Media Graphs,” Chen and Neil describe efficient and scalable techniques to use social media in order to detect emerging patterns in human rights events. They test their approach on recent events in Mexico and show that they can accurately detect relevant human rights–related tweets prior to international news sources, and in some cases, prior to local news reports, which could potentially lead to more timely, targeted, and effective advocacy by relevant human rights groups.

“Finding Patterns with a Rotten Core: Data Mining for Crime Series with Core Sets” (Wang et al.) describes a case study with the Cambridge Police Department, using a subspace clustering method to analyze the department’s full housebreak database, which contains detailed information from thousands of crimes from over a decade. They find that the method allows human crime analysts to handle vast amounts of data and provides new insights into true patterns of crime committed in Cambridge…..(More)

An In-Depth Analysis of Open Data Portals as an Emerging Public E-Service


Paper by Martin Lnenicka: “Governments collect and produce large amounts of data. Increasingly, governments worldwide have started to implement open data initiatives and also launch open data portals to enable the release of these data in open and reusable formats. Therefore, a large number of open data repositories, catalogues and portals have been emerging in the world. The greater availability of interoperable and linkable open government data catalyzes secondary use of such data, so they can be used for building useful applications which leverage their value, allow insight, provide access to government services, and support transparency. The efficient development of successful open data portals makes it necessary to evaluate them systematically, in order to understand them better, assess the various types of value they generate, and identify the required improvements for increasing this value. Thus, the attention of this paper is directed particularly to the field of open data portals. The main aim of this paper is to compare selected open data portals at the national level using content analysis and to propose a new evaluation framework, which further improves the quality of these portals. It also establishes a set of considerations for involving businesses and citizens to create e-services and applications that leverage the datasets available from these portals….(More)”

Information transparency of public administrations. The right of the people to know and the duty to disseminate public information actively


New book by Miguel Angel Blanes Climent: “This study analyzes the legal and judicial landscape in the world’s major democracies and within the United Nations, the Council of Europe, and the European Union. It is, therefore, a powerful tool for finding out who can access publicly funded information, and how, when, where, and what kind. More importantly: it sets out which administrative and judicial remedies are available when information is withheld, and what the disciplinary, financial, and criminal consequences are. The book examines in detail Spain’s new Ley 19/2013, de 9 de diciembre, on transparency, access to public information, and good government, as well as the existing regional legislation on the subject.
Particular attention is paid to access to sensitive information: the awardees and final cost of public contracts; urban-planning and environmental data; budgets and public accounts; the salaries, expenses, and travel of elected officials and civil servants; the financing of political parties, trade unions, and business organizations; health-care and housing waiting lists; recipients of subsidies; institutional advertising; services of general interest provided by private entities (telecommunications, electricity, gas, postal services) and public-service concessionaires (water, waste, transport, health care); etc.
The information that most resists publication is precisely that which allows citizens to scrutinize the management of public affairs, demand accountability, and expose waste or corruption. The author coins the motto: ‘transparency is like sincerity: we demand it of others and limit our own.’” …(More)

Turning Government Data into Better Public Service


OMB Blog: “Every day, millions of people use their laptops, phones, and tablets to check the status of their tax refund, get the latest forecast from the National Weather Service, book a campsite at one of our national parks, and much more. There were more than 1.3 billion visits to websites across the Federal Government in just the past 90 days.

Today, during Sunshine Week, when we celebrate openness and transparency in government, we are pleased to release the Digital Analytics Dashboard, a new window into the way people access the government online. For the first time, you can see how many people are using a Federal Government website, which pages are most popular, and which devices, browsers, and operating systems people are using. We’ll use the data from the Digital Analytics Program to focus our digital service teams on the services that matter most to the American people, and analyze how much progress we are making. The Dashboard will help government agencies understand how people find, access, and use government services online to better serve the public – all while protecting privacy. The program does not track individuals. It anonymizes the IP addresses of all visitors and then uses the resulting information in the aggregate….(More)
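The post does not spell out the anonymization method, but a common technique in web analytics is to truncate part of each IP address before anything is counted. A minimal sketch assuming that approach (the addresses and pages below are made up):

```python
# Minimal sketch of IP anonymization plus aggregate counting. The Digital
# Analytics Program's actual pipeline is not described in this excerpt;
# last-octet truncation is one common technique, assumed here for illustration.
from collections import Counter

def anonymize_ipv4(ip: str) -> str:
    """Zero out the final octet so no individual address is retained."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

hits = [
    ("198.51.100.23", "/weather/forecast"),
    ("198.51.100.87", "/weather/forecast"),
    ("203.0.113.5", "/parks/campsite-booking"),
]

page_counts = Counter(page for _ip, page in hits)          # aggregate, not per-user
anon_sources = Counter(anonymize_ipv4(ip) for ip, _ in hits)

print(page_counts.most_common())    # which pages are most popular
print(anon_sources.most_common())   # traffic by anonymized network, not by person
```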

The Missing Information That Municipal-Bond Investors Need


Marc Joffe at Governing: “…There are many reasons why the municipal market lacks sophistication in this area, but a big part of the problem has been a lack of free (or even low-cost) financial-statement data. In this regard, some strides are being made. First, the 2009 launch by the Municipal Securities Rulemaking Board (MSRB) of its Electronic Municipal Market Access (EMMA) system gave investors a one-stop shop for municipal financial disclosure. But as the Securities and Exchange Commission (SEC) observed recently, a large number of municipal-bond issuers have been posting their statements late or not at all. The commission’s Municipal Continuing Disclosure Cooperation Initiative has greatly increased the number of statements on EMMA. Finally, late this year the Census Bureau is expected to begin posting federal single-audit submissions online. These packages include the same basic financial statements typically found in municipal market disclosure.

But the simple publication of thousands of voluminous PDFs does not provide the degree of transparency needed to raise the level of municipal-bond-market financial literacy. The vast majority of investors and analysts lack the patience and/or technical skills needed to extract the valuable needles of insight from this haystack of disclosure.

Investors in corporate securities do not face these difficulties. For the last 20 years, company financial reports have been available in textual form on the SEC’s Electronic Data Gathering, Analysis and Retrieval (EDGAR) system. As a result, corporate financial-statement data is freely available in convenient forms around the Internet: Yahoo Finance, MarketWatch, Morningstar and your broker’s website are just a few of the places you can find this data.

So while corporate investors can readily compare the financial statistics of a safe company like Apple to an insolvent one like Radio Shack, municipal investors cannot easily perform the same exercise for Dallas and Detroit.

It wasn’t always this way. Between 1909 and 1931, the Census Bureau published an annual volume entitled “Financial Statistics of Cities Having a Population of Over 30,000.” The final edition — available at the St. Louis Federal Reserve’s website — covered 311 American cities and included hundreds of revenue, expenditure, asset and liability data points for each municipality. Unfortunately, ever since 1931, Census financial data on local governments has become less comprehensive, less timely and less comprehensible to the lay user.

In the years after 1931, we lost the understanding that comparative local-government financial statistics were a public good. While we might look to the federal government to once again offer this information in today’s era of heightened need, it may be challenged to take on this role in an era of sequesters.

But while we may need the private sector to provide this public good, the federal government can greatly reduce the cost of compiling a local-government financial-statement database. The SEC has required companies to file financial statements in text form — rather than via PDF — since the mid-1990s. In 2008, the SEC further standardized company financial reporting by requiring firms to file their statements in the form of eXtensible Business Reporting Language (XBRL), which imposes a consistent format on all filings. To date, neither the SEC nor the MSRB has pursued a similar course with respect to municipal financial disclosure.
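To see what XBRL buys analysts, consider that each reported figure becomes a machine-readable fact tied to a standard concept and reporting period, so the same query works across thousands of filings. The sketch below uses a deliberately pared-down instance document; the tag and figure are illustrative, and a real filing adds entity identifiers, units, and schema references:

```python
# Simplified sketch of extracting a tagged fact from an XBRL-style filing.
# The instance below is a minimal illustration, not a complete valid filing.
import xml.etree.ElementTree as ET

FILING = """
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2013-01-31">
  <context id="FY2014">
    <period><startDate>2014-01-01</startDate><endDate>2014-12-31</endDate></period>
  </context>
  <us-gaap:Assets contextRef="FY2014" unitRef="USD" decimals="0">231839000000</us-gaap:Assets>
</xbrl>
"""

root = ET.fromstring(FILING)
ns = {"us-gaap": "http://fasb.org/us-gaap/2013-01-31"}

for fact in root.findall("us-gaap:Assets", ns):
    # Because the tag and context are standardized, the identical query
    # works against every filing that uses the same taxonomy.
    print(fact.get("contextRef"), int(fact.text))
```

With a municipal equivalent, the Dallas-versus-Detroit comparison above would be a few lines of code rather than a manual trawl through PDFs.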

Next week, the Data Transparency Coalition, a group that advocates for the use of XBRL, will hold a Financial Regulation Summit featuring numerous congressional representatives and regulators. Perhaps the extension of XBRL to the municipal-bond market can find its way onto the agenda….(More)

Gamification harnesses the power of games to motivate


Kevin Werbach at the Conversation: “Walk through any public area and you’ll see people glued to their phones, playing mobile games like Game of War and Candy Crush Saga. They aren’t alone. 59% of Americans play video games, and contrary to stereotypes, 48% of gamers are women. The US$100 billion video game industry is among the least-appreciated business phenomena in the world today.

But this isn’t an article about video games. It’s about where innovative organizations are applying the techniques that make those games so powerfully engaging: everywhere else.

Gamification is the perhaps-unfortunate name for the growing practice of applying structural elements, design patterns, and psychological insights from game design to business, education, health, marketing, crowdsourcing and other fields. Over the past four years, gamification has gone through a cycle of (over-)hype and (overblown) disappointment common for technological trends. Yet if you look carefully, you’ll see it everywhere.

Tapping into pieces of games

Gamification involves two primary mechanisms. The first is to take design structures from games, such as levels, achievements, points, and leaderboards — in my book, For the Win, my co-author and I label them “game elements” — and incorporate them into activities. The second, more subtle but ultimately more effective, is to mine the rich vein of design techniques that game designers have developed over many years. Good games pull you in and carry you through a journey that remains engaging, using an evolving balance of challenges and a stream of well-crafted, actionable feedback.
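In software terms, the first mechanism amounts to a few data structures. The hypothetical sketch below shows points, level thresholds, badges, and a leaderboard; it is illustrative, not drawn from any product:

```python
# Hypothetical sketch of basic "game elements" layered onto an activity:
# points, levels, badges, and a leaderboard. Not any real product's code.
from dataclasses import dataclass, field

LEVEL_THRESHOLDS = [0, 100, 250, 500, 1000]   # points needed for levels 1..5

@dataclass
class Player:
    name: str
    points: int = 0
    badges: set = field(default_factory=set)

    def award(self, points, badge=None):
        self.points += points
        if badge:
            self.badges.add(badge)

    @property
    def level(self):
        return sum(1 for t in LEVEL_THRESHOLDS if self.points >= t)

players = [Player("ana"), Player("ben")]
players[0].award(120, badge="first-sale")
players[1].award(300)

leaderboard = sorted(players, key=lambda p: p.points, reverse=True)
for rank, p in enumerate(leaderboard, start=1):
    print(rank, p.name, p.points, f"level {p.level}", sorted(p.badges))
```

The second mechanism resists this kind of compression: balancing challenge and feedback across a player's journey is design craft, not a data structure, which is why systems that stop at points and badges so often fall flat.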

Many enterprises now use tools built on top of Salesforce.com’s customer relationship management platform to motivate employees through competitions, points and leaderboards. Online learning platforms such as Khan Academy commonly challenge students to “level up” by sprinkling game elements throughout the process. Even games are now gamified: Microsoft’s Xbox One and Sony’s PS4 consoles offer a meta-layer of achievements and trophies to promote greater game-play.

The differences between a gamified system that incorporates good design principles and one that doesn’t aren’t always obvious on the surface. They show up in the results.

Duolingo is an online language-learning app. It’s pervasively and thoughtfully gamified: points, levels, achievements, bonuses for “streaks,” visual progression indicators, even a virtual currency with various ways to spend it. The well integrated gamification is a major differentiator for Duolingo, which happens to be the most successful tool of its kind. With over 60 million registered users, it teaches languages to more people than the entire US public school system.

Most of the initial high-profile cases of gamification were for marketing: for example, USA Network ramped up its engagement numbers with web-based gamified challenges for fans of its shows, and Samsung gave points and badges for learning about its products.

Soon it became clear that other applications were equally promising. Today, organizations are using gamification to enhance employee performance, promote health and wellness activities, improve retention in online learning, help kids with cancer endure their treatment regimen, and teach people how to code, to name just a few examples. Gamification has potential anywhere that motivation is an important element of success.

Gamification works because our responses to games are deeply hard-wired into our psychology. Game design techniques can activate our innate desires to recognize patterns, solve puzzles, master challenges, collaborate with others, and be in the driver’s seat when experiencing the world around us. They can also create a safe space for experimentation and learning. After all, why not try something new when you know that even if you fail, you’ll get another life?…(More)

What Your Tweets Say About You


At the New Yorker: “How much can your tweets reveal about you? Judging by the last nine hundred and seventy-two words that I used on Twitter, I’m about average when it comes to feeling upbeat and being personable, and I’m less likely than most people to be depressed or angry. That, at least, is the snapshot provided by AnalyzeWords, one of the latest creations from James Pennebaker, a psychologist at the University of Texas who studies how language relates to well-being and personality. One of Pennebaker’s most famous projects is a computer program called Linguistic Inquiry and Word Count (L.I.W.C.), which looks at the words we use, and in what frequency and context, and uses this information to gauge our psychological states and various aspects of our personality….
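The technique underneath tools like L.I.W.C. is dictionary-based word counting: tally how often a text's words fall into curated categories, then normalize by length. A stripped-down sketch, with toy word lists standing in for L.I.W.C.'s validated dictionaries:

```python
# Stripped-down sketch of LIWC-style analysis: count category words and
# normalize by text length. The word lists are toy stand-ins, not LIWC's.
import re

CATEGORIES = {
    "anger":    {"hate", "furious", "annoyed", "rage"},
    "positive": {"happy", "great", "love", "good"},
    "anxiety":  {"worried", "nervous", "afraid"},
}

def profile(text):
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    return {cat: 100 * sum(w in vocab for w in words) / total
            for cat, vocab in CATEGORIES.items()}

tweet = "So furious about this commute, I hate waiting. Still, great coffee."
print(profile(tweet))   # per-category share of words, in percent
```

Scaled up to millions of texts, the same per-category percentages become the “emotional profiles” used in population-level studies like the one below.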

Take a study, out last month, from a group of researchers based at the University of Pennsylvania. The psychologist Johannes Eichstaedt and his colleagues analyzed eight hundred and twenty-six million tweets across fourteen hundred American counties. (The counties contained close to ninety per cent of the U.S. population.) Then, using lists of words—some developed by Pennebaker, others by Eichstaedt’s team—that can be reliably associated with anger, anxiety, social engagement, and positive and negative emotions, they gave each county an emotional profile. Finally, they asked a simple question: Could those profiles help determine which counties were likely to have more deaths from heart disease?

The answer, it turned out, was yes….

The researchers have a theory: they suggest that “the language of Twitter may be a window into the aggregated and powerful effects of the community context.” They point to other epidemiological studies which have shown that general facts about a community, such as its “social cohesion and social capital,” have consequences for the health of individuals. Broadly speaking, people who live in poorer, more fragmented communities are less healthy than people living in richer, integrated ones. “When we do a sub-analysis, we find that the power that Twitter has is in large part accounted for by community and socioeconomic variables,” Eichstaedt told me when we spoke over Skype. In short, a young person’s negative, angry, and stressed-out tweets might reflect his or her stress-inducing environment—and that same environment may have negative health repercussions for other, older members of the same community….(More)”