How to Hold Governments Accountable for the Algorithms They Use


In Slate: “In 2015 more than 59 million Americans received some form of benefit from the Social Security Administration, not just for retirement but also for disability or as a survivor of a deceased worker. It’s a behemoth of a government program, and keeping it solvent has preoccupied the Office of the Chief Actuary of the Social Security Administration for years. That office makes yearly forecasts of key demographic (such as mortality rates) or economic (for instance, labor force participation) factors that inform how policy can or should change to keep the program on sound financial footing. But a recent Harvard University study examined several of these forecasts and found that they were systematically biased—underestimating life expectancy and implying that funds were on firmer financial ground than warranted. The procedures and methods that the SSA uses aren’t open for inspection either, posing challenges to replicating and debugging those predictive algorithms.

Whether forecasting the solvency of social programs, waging a war, managing national security, doling out justice and punishment, or educating the populace, government has a lot of decisions to make—and it’s increasingly using algorithms to systematize and scale that bureaucratic work. In the ideal democratic state, the electorate chooses a government that provides social goods and exercises its authority via regulation. The government is legitimate to the extent that it is held accountable to the citizenry. Though as the SSA example shows, tightly held algorithms pose issues of accountability that grind at the very legitimacy of the government itself.

One of the immensely useful abilities of algorithms is to rank and prioritize huge amounts of data, turning a messy pile of items into a neat and orderly list. In 2013 the Obama administration announced that it would be getting into the business of ranking colleges, helping the citizens of the land identify and evaluate the “best” educational opportunities. But two years later, the idea of ranking colleges had been neutered, traded in for what amounts to a data dump of educational statistics called the College Scorecard. The human influences, subjective factors, and methodological pitfalls involved in quantifying education into rankings would be numerous. Perhaps the government sensed that any ranking would be dubious—that it would be riddled with questions of what data was used and how various statistical factors were weighted. How could the government make such a ranking legitimate in the eyes of the public and of the industry that it seeks to hold accountable?

That’s a complicated question that goes far beyond college rankings. But whatever the end goal, government needs to develop protocols for opening up algorithmic black boxes to democratic processes.

Transparency offers one promising path forward. Let’s consider the new risk-assessment algorithm that the state of Pennsylvania is developing to help make criminal sentencing decisions. Unlike some other states that are pursuing algorithmic criminal justice using proprietary systems, the level of transparency around the Pennsylvania Risk Assessment Project is laudable, with several publicly available in-depth reports on the development of the system….(More)”

Global fact-checking up 50% in past year


Mark Stencel at Duke Reporters’ Lab: “The high volume of political truth-twisting is driving demand for political fact-checkers around the world, with the number of fact-checking sites up 50 percent since last year.

The Duke Reporters’ Lab annual census of international fact-checking currently counts 96 active projects in 37 countries. That’s up from 64 active fact-checkers in the 2015 count. (Map and List)

A bumper crop of new fact-checkers across the Western Hemisphere helped increase the ranks of journalists and government watchdogs who verify the accuracy of public statements and track political promises. The new sites include 14 in the United States and two in Canada, as well as seven additional fact-checkers in Latin America. There also were new projects in 10 other countries, from North Africa to Central Europe to East Asia…

The growing numbers have even spawned a new global association, the International Fact-Checking Network hosted by the Poynter Institute, a media training center in St. Petersburg, Florida.

Promises, Promises

Some of the growth has come in the form of promise-tracking. Since January 2015, fact-checkers launched six sites in five countries devoted to tracking the status of pledges candidates and party leaders made in political campaigns. In Tunisia, there are two new sites dedicated to promise-tracking — one devoted to the country’s president and the other to its prime minister.

There are another 20 active fact-checkers elsewhere that track promises,…

Nearly two-thirds of the active fact-checkers (61 of 96, or 64 percent) are directly affiliated with a news organization. However, this breakdown reflects the dominant business structure in the United States, where 90 percent of fact-checkers are part of a news organization. That includes nine of 11 national projects and 28 of 30 state/local fact-checkers…The story is different outside the United States, where fewer than half of the active fact-checking projects (24 of 55, or 44 percent) are affiliated with news organizations.

The other fact-checkers are typically associated with non-governmental, non-profit and activist groups focused on civic engagement, government transparency and accountability. A handful are partisan, especially in conflict zones and in countries where the lines between independent media, activists and opposition parties are often blurry and where those groups are aligned against state-controlled media or other governmental and partisan entities….(More)

This is the #CitizenShift


The New Citizenship Project (UK): “… it’s become increasingly apparent that the shift from Consumer to Citizen isn’t just something that ought to happen, but something that is ALREADY happening, across the world and in all aspects of society.

It’s also become clear that the story we’re working with is one that many people in many organisations find exciting and empowering.  Thinking of people as Citizens rather than as Consumers is a powerful platform for ideas and initiatives that can genuinely make a difference in the world.

That’s why we’ve decided now is the time to share this story, bringing all our research and emerging practice together in one report… The report is called The Citizen Shift: A guide to understanding and embracing the emerging era of the Citizen. It looks back over the 20th and early 21st century, exploring the shifting idea of the role of the individual in society across the period, from Subject to Consumer and now to Citizen. Drawing on ideas from academic disciplines including behavioural economics, evolutionary biology, and philosophy, and on examples of practice across government, business and civil society, the report makes a compelling case that a moment of rare opportunity is upon us: we all have agency in making the most of the dynamics that make the shift from Consumer to Citizen not just a possibility, but the emerging reality….(More)”

Open government data and why it matters


Australian Government: “This was a key focus of the Prime Minister’s $1.1 billion innovation package announced this month.

The Bureau of Communications Research (BCR) today released analysis of the impact of open government data, revealing its potential to generate up to $25 billion per year, or 1.5 per cent of Australia’s GDP.

‘In Australia, users can already access and re-use more than 7000 government data sets published on data.gov.au,’ said Dr Paul Paterson, Chief Economist and Head of the Bureau of Communications Research (BCR).

‘Some of the high-value data sets include geospatial/mapping data, health data, transport data, mining data, environmental data, demographics data, and real-time emergency data.

‘Many Australians are unaware of the flow-on benefits from open government data as a result of the increased innovation and informed choice it creates. For example, open data has the power to generate new careers, more efficient government revenues, improved business practices, and drive better public engagement.’…(More)”

New Tools for Collaboration: The Experience of the U.S. Intelligence Community


IBM Center for The Business of Government: “This report is intended for an audience beyond the U.S. Intelligence Community—senior managers in government, their advisors and students of government performance who are interested in the progress of collaboration in a difficult environment. …

The purpose of this report is to learn lessons by looking at the use of internal collaborative tools across the Intelligence Community. The initial rubric was tools, but the real focus is collaboration, for while the tools can enable, what ultimately matters are policies and practices interacting with organizational culture. It looks for good practices to emulate. The ultimate question is how and how much could, and should, collaborative tools foster integration across the Community. The focus is analysis and the analytic process, but collaborative tools can and do serve many other functions in the Intelligence Community—from improving logistics or human resources, to better connecting collection and analysis, to assisting administration and development, to facilitating, as one interlocutor put it, operational “go” decisions. Yet it is in the analytic realm that collaboration is both most visible and most rubs against traditional work processes that are not widely collaborative.

The report defines terms and discusses concepts, first exploring collaboration and coordination, then defining collaborative tools and social media, then surveying the experience of the private sector. The second section of the report uses those distinctions to sort out the blizzard of collaborative tools that have been created in the various intelligence agencies and across them. The third section outlines the state of collaboration, again both within agencies and across them. The report concludes with findings and recommendations for the Community. The recommendations amount to a continuum of possible actions in making more strategic what is and will continue to be more a bottom-up process of creating and adopting collaborative tools and practices….(More)”

The rise of the citizen expert


Beth Noveck (The GovLab) at Policy Network: “Does the EU need to be more democratic? It is not surprising that Jürgen Habermas, Europe’s most famous democratic theorist, laments the dearth of mechanisms for “fulfilling the citizens’ political will” in European institutions. The controversial handling of the Greek debt crisis, according to Habermas, was clear evidence of the need for more popular input into otherwise technocratic decision-making. Incremental progress toward participation does not excuse a growing crisis of democratic legitimacy that, he says, is undermining the European project….

For participatory democrats like Habermas, opportunities for deliberative democratic input by citizens are essential to legitimacy. And, to be sure, the absence of such opportunities is no guarantee of more effective outcomes. A Greek referendum in July 2015 scuttled European austerity plans.

But pitting technocracy against citizenship is a false dichotomy resulting from the long-held belief, even among reformers, that only professional public servants or credentialed elites possess the requisite abilities to govern in a complex society. Citizens are spectators who can express opinions but cognitive incapacity, laziness or simply the complexity of modern society limit participation to asking people what they feel by means of elections, opinion polls, or social media.

Although seeing technocracy as the antinomy of citizenship made sense when expertise was difficult to pinpoint, tools like LinkedIn, which make know-how more searchable, are now making it possible for public institutions to get more help from more diverse sources – including from within the civil service – systematically, and could enable more members of the public to participate actively in governing based on what they know and care about. It is high time for institutions to begin to leverage such platforms to match the need for expertise to the supply of it and, in the process, increase engagement, becoming more effective and more legitimate.

Such software does more than catalogue credentials. The internet is radically decreasing the costs of identifying diverse forms of expertise so that the person who has taken courses on an online learning platform can showcase those credentials with a searchable digital badge. The person who has answered thousands of questions on a question-and-answer website can demonstrate their practical ability and willingness to help. Ratings by other users further attest to the usefulness of their contributions. In short, it is becoming possible to discover what people know and can do in ever more finely tuned ways and match people to opportunities to participate that speak to their talents….
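
How such matching might work can be sketched in a few lines of code. The profiles and skill tags below are invented, and a real platform would rank on far richer signals (badges, ratings, answer histories); this is only a minimal illustration of the matching idea:

```python
# A minimal sketch with made-up profiles: rank citizens by how much of a
# public task's needed expertise their declared skills cover.
profiles = {
    "amira": {"epidemiology", "statistics", "gis"},
    "ben":   {"procurement", "contract law"},
    "chen":  {"statistics", "survey design"},
}
task_needs = {"statistics", "survey design", "epidemiology"}

ranked = sorted(
    ((len(task_needs & skills) / len(task_needs), name)
     for name, skills in profiles.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{name}: {score:.0%} of needed skills")
```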

In an era in which it is commonplace for companies to use technology to segment customers in an effort to promote their products more effectively, the idea of matching might sound obvious. To be sure, it is common practice in business – but in the public sphere, the notion that participation should be tailored to the individual’s abilities and tethered to day-to-day practices of governing, not politicking, is new.  More accurately, it is a revival of Athenian life where citizen competence and expertise were central to economic and military success.

What makes this kind of targeted engagement truly democratic – and citizenship in this vision more active, robust, and meaningful – is that such targeting allows us to multiply the number and frequency of ways to engage productively in a manner consistent with each person’s talents. When we move away from focusing on citizen opinion to discovering citizen expertise, we catalyse participation that is also independent of geographical boundaries….(More)”

Moving from Open Data to Open Knowledge: Announcing the Commerce Data Usability Project


Jeffrey Chen, Tyrone Grandison, and Kristen Honey at the US Department of Commerce: “…in 2016, the DOC is committed to building on this momentum with new and expanded efforts to transform open data into knowledge into action.

[Graphic: DOC open data. Credit: Radhika Bhatt, Commerce Data Service]

DOC has been in the business of open data for a long time. DOC’s National Oceanic and Atmospheric Administration (NOAA) alone collects and disseminates huge amounts of data that fuel the global weather economy—and this information represents just a fraction of the tens of thousands of datasets that DOC collects and manages, on topics ranging from satellite imagery to material standards to demographic surveys.

Unfortunately, far too many DOC datasets are hard to find, difficult to use, or not yet publicly available on Data.gov, the home of the U.S. government’s open data. This challenge is not exclusive to DOC; indeed, under Project Open Data, Federal agencies are working hard on various efforts to make taxpayer-funded data more easily discoverable.

[Image: CDUP screenshot]

One of these efforts is DOC’s Commerce Data Usability Project (CDUP). To unlock the power of data, just making data open isn’t enough. It’s critical to make data easier to find and use—to provide information and tools that make data accessible and actionable for all users. That’s why DOC formed a public-private partnership to create CDUP, a collection of online data tutorials that provide students, developers, and entrepreneurs with the necessary context and code for them to start quickly extracting value from various datasets. Tutorials exist on topics such as:

  • NOAA’s Severe Weather Data Inventory (SWDI), demonstrating how to use hail data to save life and property. The tutorial helps users see that hail events often occur in the summer (late night to early morning), and in midwestern and southern states; a minimal sketch of querying this dataset follows the list.
  • Security vulnerability data from the National Institute of Standards and Technology (NIST). The tutorial helps users see that spikes and dips in security incidents consistently occur in the same set of weeks each year.
  • Visible Infrared Imaging Radiometer Suite (VIIRS) data from the National Oceanic and Atmospheric Administration (NOAA). The tutorial helps users understand how to use satellite imagery to estimate populations.
  • American Community Survey (ACS) data from the U.S. Census Bureau. The tutorial helps users understand how nonprofits can identify communities that they want to serve based on demographic traits.
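
As an illustration of the kind of context-plus-code these tutorials pair together, here is a minimal sketch of querying the SWDI dataset from the first item above. The endpoint pattern and the ZTIME field name are assumptions about NOAA’s public web service, so defer to the tutorial for the exact payload:

```python
# A minimal sketch, assuming NOAA's public SWDI web service and its field
# names: pull a week of radar-detected hail signatures and tally them by
# hour of day to look for the late-night/early-morning peak.
import collections
import requests

URL = "https://www.ncdc.noaa.gov/swdiws/json/nx3hail/20150601:20150608"

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
records = resp.json().get("result", [])

by_hour = collections.Counter()
for rec in records:
    ztime = rec.get("ZTIME", "")        # e.g. "2015-06-01T03:22:10Z" (assumed)
    if len(ztime) >= 13:
        by_hour[ztime[11:13]] += 1      # characters 11-12 hold the hour

for hour in sorted(by_hour):
    print(f"{hour}:00  {by_hour[hour]:4d} hail signatures")
```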

In the coming months, CDUP will continue to expand with a rich, diverse set of additional tutorials….(More)

Designing for Respect: UX Ethics for the Digital Age


O’Reilly Publishing: “Although designers are responsible for orchestrating the behaviors of all sorts of interactions on the Web, on mobile devices, and in consumer environments every day, they often forget—or don’t fully realize—the influence they have on others.

With this O’Reilly report, you’ll examine the subject of design with an ethical lens, and focus specifically on how UX, interaction, graphic, and visual product designers can affect a user’s time, mood, and trust. Author David Hindman, Interaction Design Director at Fjord San Francisco, investigates the topic of respectful design by providing examples of the challenges and frameworks to help inform considerate design solutions.

Designers and business owners alike will examine some of the most commonly used digital services from an ethical standpoint. This report will help you:

  • Recognize deceitful patterns, and learn how to create more efficient and honest solutions
  • Understand the impact of respectful design on business
  • Raise awareness about the value of clarity and respect from digital services…(More)”

 

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
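
The HealthIntell algorithms themselves are proprietary, but the core idea of predicting demand on a daily and hourly basis can be sketched with a simple hour-of-week baseline. The file and column names below are invented, and a production system would layer on weather, events and trend terms:

```python
# A minimal sketch, not the HealthIntell implementation: average historical
# A&E attendances for each of the 168 hour-of-week slots and use that
# profile to anticipate daily and hourly peaks.
import pandas as pd

# Assumed input: one row per attendance with an "arrival" timestamp.
df = pd.read_csv("attendances.csv", parse_dates=["arrival"])

hourly = (
    df.set_index("arrival")
      .resample("h")            # hourly attendance counts
      .size()
      .rename("count")
      .reset_index()
)
hourly["hour_of_week"] = (
    hourly["arrival"].dt.dayofweek * 24 + hourly["arrival"].dt.hour
)

# Average demand for each hour-of-week slot becomes the forecast profile.
profile = hourly.groupby("hour_of_week")["count"].mean()

def forecast(ts: pd.Timestamp) -> float:
    """Expected attendances for the hour containing ts."""
    return profile[ts.dayofweek * 24 + ts.hour]

print(forecast(pd.Timestamp("2016-01-30 19:00")))
```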

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services is key to efficiency gains, using data that is already available. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – is raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.
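
The LFB’s SAS model is not public, but the shape of such a system can be sketched: fit a classifier on historical incident data, score every household, and send the roughly 90,000 annual visits to the highest-risk homes first. The file, columns and label below are assumptions, with all features taken to be numeric; the real model blends some 60 inputs:

```python
# A minimal sketch, not the LFB/SAS model: score households for fire risk
# with a classifier fit on historical incidents, then target prevention
# visits at the highest-risk homes.
import pandas as pd
from sklearn.linear_model import LogisticRegression

homes = pd.read_csv("households.csv")           # assumed feature table
features = ["deprivation_index", "building_age", "prior_incidents",
            "smoke_alarm_fitted"]               # assumed numeric columns

model = LogisticRegression(max_iter=1000)
model.fit(homes[features], homes["had_fire"])   # assumed 0/1 historical label

homes["risk"] = model.predict_proba(homes[features])[:, 1]
visits = homes.sort_values("risk", ascending=False).head(90_000)
print(visits[["risk"] + features].head())
```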

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”

 

7 Ways Local Governments Are Getting Creative with Data Mapping


Ben Miller at GovTech:  “As government data collection expands, and as more of that data becomes publicly available, more people are looking to maps as a means of expressing the information.

And depending on the type of application, a map can be useful for both the government and its constituents. Many maps help government servants operate more efficiently and save money, while others will answer residents’ questions so they don’t have to call a government worker for the answer…

Here are seven examples of state and local governments using maps to help themselves and the people they serve.

1. DISTRICT OF COLUMBIA, IOWA GET LOCAL AND CURRENT WITH THE WEATHER

[Image: Washington, D.C. snow plow map]

As Winter Storm Jonas was busy dropping nearly 30 inches of snow on the nation’s capital, officials in D.C. were working to clear it. And thanks to a mapping application they launched, citizens could see exactly how the city was going about that business.

The District of Columbia’s snow map lets users enter an address, and then shows what snow plows did near that address within a given range of days. The map also shows where the city received 311 requests for snow removal and gives users a chance to look at recent photos from road cameras showing driving conditions…..
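
Under the hood, a query like this reduces to a proximity-and-date filter over plow GPS pings. A minimal sketch with invented data, since the District’s actual schema and API are not shown here:

```python
# A minimal sketch: keep plow GPS pings that fall within a radius of a
# geocoded address during a chosen window of days, via haversine distance.
from datetime import datetime, timedelta
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

pings = [                                   # assumed: (timestamp, lat, lon)
    (datetime(2016, 1, 23, 14, 5), 38.9072, -77.0369),
    (datetime(2016, 1, 20, 9, 30), 38.9500, -77.1000),
]
home = (38.9080, -77.0360)                  # geocoded user address
since = datetime(2016, 1, 25) - timedelta(days=7)

nearby = [p for p in pings
          if p[0] >= since and haversine_km(p[1], p[2], *home) <= 0.5]
print(nearby)                               # plows that passed within 500 m
```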

2. LOS ANGELES MAPS EL NIÑO RESOURCES, TRENDS

[Image: El Niño Watch map]

Throughout the winter, weather monitoring experts warned the public time and again that an El Niño system was brewing in the Pacific Ocean that looked to be one of the largest, if not the largest, ever. That would mean torrents of rain for a parched state that’s seen mudslides and flooding during storms in the past.

So to prepare its residents, the city of Los Angeles published a map in January that lets users see both decision-informing trends and the location of resources. Using the application, one can toggle layers that let them know what the weather is doing around the city, where traffic is backed up, where the power is out, where they can find sand bags to prevent flood damage and more….

3. CALIFORNIA DIVES DEEP INTO AIR POLLUTION RISKS

[Image: CalEnviroScreen]

….So, faced with a legislative mandate to identify disadvantaged communities, the California Office of Environmental Health Hazard Assessment decided that it wouldn’t just examine smog levels — it would also take a look at the prevalence of at-risk people across the state.

The result is a series of three maps, the first two examining each factor separately and the third combining them. That allows the state and its residents to see the places where air pollution is the biggest problem for the people it puts most at risk….

4. STREAMLINING RESIDENT SERVICE INFORMATION

[Image: Manassas curbside pickup map]

The city of Manassas, Va., relied on an outdated paper map and a long-time, well-versed staffer to answer questions about municipal curbside pickup services until it launched this map in 2014. The map allows users to enter their address, and then gives them easy-to-read information about when to put out various things on their curb for pickup.

That’s useful because the city’s fall leaf collection schedule changes every year. So the map benefits not only residents who want information, but also city staff, who don’t have to field as many calls.

The map also shows users the locations of resources they can use and gives them city phone numbers in case they still have questions, and displays it all in a popup pane at the bottom of the map.

5. PLACING TOOLS IN THE HANDS OF THE PUBLIC

A lot of cities and counties have started publishing online maps showing city services and releasing government data.

But Chicago, Boston and Philadelphia stand out as examples of maps that take the idea one step further — because each one offers a staggering amount of choices for users.

Chicago’s new OpenGrid map, just launched in January, is a versatile map that lets users search for certain data like food inspection reports, street closures, potholes and more. That’s enough to answer a lot of questions, but what adds even more utility is the map’s various narrowing tools. Users can narrow searches to a zip code, or they can draw a shape on the map and only see results within that shape. They can perform sub-searches within results and they can choose how they’d like to see the data displayed.
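
The draw-a-shape narrowing comes down to a point-in-polygon test over the chosen dataset. A minimal sketch with invented records and a hypothetical rectangle (OpenGrid’s own implementation may differ):

```python
# A minimal sketch of a draw-a-shape filter: keep only records whose
# coordinates fall inside a user-drawn polygon, using shapely.
from shapely.geometry import Point, Polygon

# Polygon vertices as (longitude, latitude), e.g. captured from the map UI.
drawn = Polygon([(-87.65, 41.88), (-87.62, 41.88),
                 (-87.62, 41.90), (-87.65, 41.90)])

records = [
    {"type": "pothole", "lon": -87.63, "lat": 41.89},
    {"type": "street closure", "lon": -87.70, "lat": 41.95},
]

inside = [r for r in records if drawn.contains(Point(r["lon"], r["lat"]))]
print(inside)   # only the pothole falls within the drawn shape
```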

Philadelphia’s platform makes use of buttons, icons and categories to help users sift through the spatially-enabled data available to them. Options include future lane closures, bicycle paths, flu shots, city resources, parks and more.

Boston’s platform is open for users to submit their own maps. And submit they have. The city portal offers everything from maps of bus stops to traffic data pulled from the Waze app.

6. HOUSTON TRANSFORMS SERVICE REQUEST DATA

[Image: Houston 311 service request map]

A 311 service functions as a means of bringing problems to city staff’s attention. But the data itself only goes so far — it needs interpretation.

Houston’s 311 service request map helps users easily analyze the data so as to spot trends. The tool offers lots of ways to narrow data down, and can isolate many different kinds of request so users can see whether one problem is reported more often in certain areas.
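
That sort of narrowing can be approximated offline with a simple group-and-count over a 311 data export; the file and column names here are assumptions:

```python
# A minimal sketch of the 311 trend analysis: count service requests by
# type and neighborhood to see where each problem is reported most often.
import pandas as pd

reqs = pd.read_csv("houston_311.csv")   # assumed export of 311 requests
trend = (
    reqs.groupby(["request_type", "neighborhood"])
        .size()
        .rename("count")
        .reset_index()
        .sort_values("count", ascending=False)
)
print(trend.head(10))
```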

7. GUIDING BUSINESS GROWTH

For the last several years, the city of Rancho Cucamonga, Calif., has been designing all sorts of maps through its Rancho Enterprise Geographic Information Systems (REGIS) project. Many of them have served specific city purposes, such as tracking code enforcement violations and offering police a command system tool for special events.

The utilitarian foundation of REGIS extends to its public-facing applications as well. One example is INsideRancho, a map built with economic development efforts in mind. The map lets users search and browse available buildings to suit business needs, narrowing results by square footage, zoning and building type. Users can also find businesses by name or address, and look at property exteriors via an embedded connection with Google Street View….(More)”