Unpacking Civic Tech – Inside and Outside of Government


David Moore at Participatory Politics Foundation: “…I’ll argue it’s important to unpack the big-tent term “civic tech” to at least five major component areas, overlapping in practice & flexible of course – in order to more clearly understand what we have and what we need:

  • Responsive & efficient city services (e.g., SeeClickFix)
  • Open data portals & open government data publishing / visualization (Socrata, OpenGov.com)
  • Engagement platforms for government entities (Mindmixer aka Sidewalk)
  • Community-focused organizing services (Change, NextDoor, Brigade – these could validly be split, as NextDoor is of course place-based IRL)
  • Geo-based services & open mapping data (e.g., Civic Insight)

More precisely, instead of “civic tech”, the term #GovTech can be productively applied to companies whose primary business model is vending to government entities – some #govtech is #opendata, some is civic #engagement, and that’s healthy & brilliant. But it doesn’t make sense to me to conflate as “civic tech” both government software vendors and the open-data work of good-government watchdogs. Another framework for understanding the inside / outside relationship to government, in company incorporation strategies & priorities, is broadly as follows:

  • tech entirely-outside government (such as OpenCongress or OpenStates);
  • tech mostly-outside government, where some elected officials volunteer to participate (such as AskThem, Councilmatic, DemocracyOS, or Change Decision Makers);
  • tech mostly-inside government, paid-for-by-government (such as Mindmixer or SpeakUp or OpenTownHall) where elected officials or gov’t staff sets the priorities, with the strong expectation of an official response;
  • deep legacy tech inside government, the enterprise vendors of closed-off CRM software to Congressional offices (including major defense contractors!).

These are the websites up and running today in the civic tech ecosystem – surveying them, I see there’s a lot of work still to do on developing advanced metrics towards thicker civic engagement: evaluating whether the existing tools are having the impact we hope and expect at their level of capitalization, and better contextualizing the role of very-small non-profit alternatives….

One question to study is whether the highest-capitalized U.S. civic tech companies (Change, NextDoor, Mindmixer, Socrata, possibly Brigade) – which also generally have most users – are meeting ROI on continual engagement within communities.

  • If it’s a priority metric for users of a service to attend a community meeting, for example, are NextDoor or Mindmixer having expected impact?
  • How about metrics on return participation, joining an advocacy group, attending a district meeting with their U.S. reps, organizing peer-to-peer with neighbors?
  • How about writing or annotating their own legislation at the city level, introducing it for an official hearing, and moving it up the chain of government to state and even federal levels for consideration? What actual new popular public policies or systemic reforms are being carefully, collaboratively passed?
  • Do less-capitalized, community-based non-profits (AskThem, 596 Acres, OpenPlans’ much-missed Shareabouts, CKAN data portals, LittleSis, BeNeighbors, PBNYC tools) – with less scale, but with more open-source, open-data tools that can be remixed – improve on the tough metric of ROI on continual engagement or research-impact in the news?…(More)

The Causes, Costs and Consequences of Bad Government Data


Katherine Barrett & Richard Greene in Governing: “Data is the lifeblood of state government. It’s the crucial commodity that’s necessary to manage projects, avoid fraud, assess program performance, keep the books in balance and deliver services efficiently. But even as the trend toward greater reliance on data has accelerated over the past decades, the information itself has fallen dangerously short of the mark. Sometimes it doesn’t exist at all. But worse than that, all too often it’s just wrong.

There are examples everywhere. Last year, the California auditor’s office issued a report that looked at accounting records at the State Controller’s Office to see whether it was accurately recording sick leave and vacation credits. “We found circumstances where instead of eight hours, it was 80 and in one case, 800,” says Elaine Howle, the California state auditor. “And the system didn’t have controls to say that’s impossible.” The audit found 200,000 questionable hours of leave due to data entry errors, with a value of $6 million.

Mistakes like that are embarrassing, and can lead to unequal treatment of valued employees. Sometimes, however, decisions made with bad data can have deeper consequences. In 2012, the secretary of environmental protection in Pennsylvania told Congress that there was no evidence the state’s water quality had been affected by fracking. “Tens of thousands of wells have been hydraulically fractured in Pennsylvania,” he said, “without any indication that groundwater quality has been impacted.”

But by August 2014, the same department published a list of 248 incidents of damage to well water due to gas development. Why didn’t the department pick up on the water problems sooner? A key reason was that the data collected by its six regional offices had not been forwarded to the central office. At the same time, the regions differed greatly in how they collected, stored, transmitted and dealt with the information. An audit concluded that Pennsylvania’s complaint tracking system for water quality was ineffective and failed to provide “reliable information to effectively manage the program.”

When data is flawed, the consequences can reach throughout the entire government enterprise. Services are needlessly duplicated; evaluation of successful programs is difficult; tax dollars go uncollected; infrastructure maintenance is conducted inefficiently; health-care dollars are wasted. The list goes on and on. Increasingly, states are becoming aware of just how serious the problem is. “The poor quality of government data,” says Dave Yost, Ohio’s state auditor, “is probably the most important emerging trend for government executives, across the board, at all levels.”

Just how widespread a problem is data quality? In a Governing telephone survey with more than 75 officials in 46 states, about 7 out of 10 said that data problems were frequently or often an impediment to doing their business effectively. No one who worked with program data said this was rarely the case. (View the full results of the survey in this infographic.)…(More)

See also: Bad Data Is at All Levels of Government and The Next Big Thing in Data Analytics

Local Governments Need Financial Transparency Tools


Cities of the Future: “Comprehensive financial transparency — allowing anyone to look up the allocation of budgets, expenses by department, and even the ledger of each individual expense as it happens — can help local governments restore residents’ confidence, help manage the budget efficiently and make more informed decisions for new projects and money allocation.

A few weeks ago, we had municipal elections in Spain. Many local governments changed hands and the new administrations had to review the current budgets, see where money was being spent and, on occasion, discovered expenses they were not expecting.

As costs rise and cities find it more difficult to provide the same services without raising taxes, citizens, among others, are demanding full disclosure of income and expenses.

Tools such as the OpenGov platform are helping cities accomplish that goal…Earlier this year the city of Beaufort (pop. 13,000), South Carolina’s second-oldest city, known for its historic charm and moss-laden oak trees, decided to implement OpenGov. It rolled out the platform to the public last February, becoming the first city in the state to provide the public with in-depth, comprehensive financial data (spanning five budget years).

The reception by the city council and residents was extremely positive. Residents can now look up where their tax money goes down to itemized expenses. They can also see up-to-date charts of every part of the budget, how it is being spent, and what remains to be used. City council members can monitor the administration’s performance and ask informed questions at town meetings about the budget use, right down to the smallest expenses….

Many cities are now implementing open data tools to share information on different aspects of city services, such as transit information, energy use, water management, etc. But those tools are difficult to use and do not provide comprehensive financial information about the use of public money. …(More)”

The digital revolution liberating Latin American people


Luis Alberto Moreno in the Financial Times: “Imagine a place where citizens can deal with the state entirely online, where all health records are electronic and the wait for emergency care is just seven minutes. Singapore? Switzerland? Try Colima, Mexico.

Pessimists fear the digital revolution will only widen social and economic disparities in the developing world — particularly in Latin America, the world’s most unequal region. But Colima, though small and relatively prosperous, shows how some of the region’s governments are harnessing these tools to modernise services, improve quality of life and share the benefits of technology more equitably.

In the past 10 years, this state of about 600,000 people has transformed the way government works, going completely digital. Its citizens can carry out 62 procedures online, from applying for permits to filing crime reports. No internet at home? Colima offers hundreds of free WiFi hotspots.

Colombia and Peru are taking broadband to remote corners of their rugged territories. Bogotá has subsidised the expansion of its fibre optic network, which now links virtually every town in the country. Peru is expanding a programme that aims to bring WiFi to schools, hospitals and other public buildings in each of its 25 regions. The Colombian plan, Vive Digital, fosters internet adoption among all its citizens. Taxes on computers, tablets and smartphones have been scrapped. Low-income families have been given vouchers to sign up for broadband. In five years, the percentage of households connected to the internet jumped from 16 per cent to 50 per cent. Among small businesses it soared from 7 per cent to 61 per cent.

Inexpensive devices and ubiquitous WiFi, however, do not guarantee widespread usage. Diego Molano Vega, an architect of Vive Digital, found that many programs designed for customers in developed countries were ill suited to most Colombians. “There are no poor people in Silicon Valley,” he says. Latin American governments should use their purchasing power to push for development of digital services easily adopted by their citizens and businesses. Chile is a leader: it has digitised hundreds of trámites — bureaucratic procedures involving endless forms and queues. In a 4,300km-long country of mountains, deserts and forests, this enables access to all sorts of services through the internet. Entrepreneurs can now register businesses online for free in a single day.

Technology can be harnessed to boost equity in education. Brazil’s Mato Grosso do Sul state launched a free online service to prepare high school students for a tough national exam in which a good grade is a prerequisite for admission to federal universities. On average the results of the students who used the service were 31 per cent higher than those of their peers, prompting 10 other states to adopt the system.

Digital tools can also help raise competitiveness in business. Uruguay’s livestock information system keeps track of the country’s cattle. The publicly financed electronic registry ensures every beast can be traced, making it easier to monitor outbreaks of diseases….(More)”

 

Cities show how to make open data usable


Bianca Spinosa at GCN: “Government agencies have no shortage of shareable data. Data.gov, the open-data clearinghouse that launched in May 2009, had 147,331 datasets as of mid-July, and state and local governments are joining federal agencies in releasing ever-broader arrays of information.

The challenge, however, remains making all that data usable. Obama administration officials like to talk about how the government’s weather data supports forecasting and analysis that benefit businesses and help Americans every day. But relatively few datasets do more than just sit there, and fewer still are truly accessible for the average person.

At the federal level, that’s often because agency missions do not directly affect citizens the way that local governments do. Nevertheless, every agency has customers and communities of interest, and there are lessons feds can learn from how cities are sharing their data with the public.

One such model is Citygram. The app links to a city’s open-data platform and sends subscribers a weekly text or email message about selected activities in their neighborhoods. Charlotte officials worked closely with Code for America fellows to develop the software, and the app launched in December 2014 in that city and in Lexington, Ky.

Three other cities – New York, Seattle, and San Francisco – have since joined, and Orlando, Fla.; Honolulu; the Research Triangle area of North Carolina; and Montgomery County, Md., are considering doing so.

Citygram “takes open data and transforms it, curates it and translates it into human speech,” said Twyla McDermott, Charlotte’s corporate IT program manager. “People want to know what’s happening around them.”

Demonstrating real-world utility

People in the participating cities can go to Citygram.org, select their city and choose topics of interest (such as pending rezonings or new business locations). Then they enter their address and a radius to consider “nearby” and finally select either text or email for their weekly notifications.

Any city government can use the technology, which is open source and freely available on GitHub. San Francisco put its own unique spin on the app by allowing subscribers to sign up for notifications on tree plantings. With Citygram NYC, New Yorkers can find information on vehicle collisions within a radius of up to 4 miles….(More)”
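The matching logic described above (keep only the events within a subscriber’s chosen radius of their address) comes down to a great-circle distance filter. The sketch below is illustrative only, not Citygram’s actual code (the real app is open source on GitHub); the event fields, sample coordinates, and function names are all hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def nearby_events(events, lat, lon, radius_miles):
    """Keep only the events within radius_miles of the subscriber's address."""
    return [e for e in events
            if haversine_miles(lat, lon, e["lat"], e["lon"]) <= radius_miles]

# Hypothetical sample: two items pulled from a city's open-data feed
events = [
    {"title": "Rezoning petition near uptown", "lat": 35.2271, "lon": -80.8431},
    {"title": "New business location",         "lat": 35.4000, "lon": -80.6000},
]

# Subscriber in uptown Charlotte, using the 4-mile maximum radius
# mentioned for Citygram NYC
digest = nearby_events(events, 35.2270, -80.8430, 4.0)
```

A weekly job could run a filter like this per subscriber and hand the resulting digest to a text or email delivery service.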

The internet is the answer to all the questions of our time


Cory Doctorow in The Guardian: “…Why do people work for these organisations? Because they are utopians. Not utopians in the sense of believing that the internet is predestined to come out all right no matter what. Rather, we are utopians because, on the one hand, we are terrified of what kind of surveillance and control the internet enables, and because, on the other hand, we believe that the future is up for grabs: that we can work together to change what the internet is and what it will become. Nothing is more utopian than a belief that, when things are bad, we can make them better.

The internet has become the nervous system of the 21st century, wiring together devices that we carry, devices that are in our bodies, devices that our bodies are in. It is woven into the fabric of government service delivery, of war-fighting systems, of activist groups, of major corporations and teenagers’ social groups and the commerce of street-market hawkers.

There are many fights more important than the fight over how the internet is regulated. Equity in race, gender, sexual preference; the widening wealth gap; the climate crisis – each one far more important than the fight over the rules for the net.

Except for one thing: the internet is how every one of these fights will be won or lost. Without a free, fair and open internet, proponents of urgent struggles for justice will be outmaneuvered and outpaced by their political opponents, by the power-brokers and reactionaries of the status quo. The internet isn’t the most important fight we have; but it’s the most foundational….

The questions of the day are “How do we save the planet from the climate crisis?” and “What do we do about misogyny, racial profiling and police violence, and homophobic laws?” and “How do we check mass surveillance and the widening power of the state?” and “How do we bring down autocratic, human-rights-abusing regimes without leaving behind chaos and tragedy?”

Those are the questions.

But the internet is the answer. If you propose to fix any of these things without using the internet, you’re not being serious. And if you want to free the internet to use in all those fights, there’s a quarter century’s worth of Internet Utopians who’ve got your back….(More)

Using social media in hotel crisis management: the case of bed bugs


Social media has helped to bridge the communication gap between customers and hotels. Bed bug infestations are a growing health crisis and have attracted increasing attention on social media sites. If this crisis is not managed effectively, bed bug infestations can cause economic loss and reputational damage to hotel properties, ranging from negative comments and complaints to possible lawsuits. Thus, it is essential for hoteliers to understand the importance of social media in crisis communication, and to incorporate social media into hotels’ crisis management plans.

This study serves as one of the first attempts in the hospitality field to offer discussions and recommendations on how hotels can manage the bed bug crisis and other crises of this kind by incorporating social media into their crisis management practices….(More)”

The Data Revolution


Review of Rob Kitchin’s The Data Revolution: Big Data, Open Data, Data Infrastructures & their Consequences by David Moats in Theory, Culture and Society: “…As an industry, academia is not immune to cycles of hype and fashion. Terms like ‘postmodernism’, ‘globalisation’, and ‘new media’ have each had their turn filling the top line of funding proposals. Although they are each grounded in tangible shifts, these terms become stretched and fudged to the point of becoming almost meaningless. Yet, they elicit strong, polarised reactions. For at least the past few years, ‘big data’ seems to be the buzzword, which elicits funding, as well as the ire of many in the social sciences and humanities.

Rob Kitchin’s book The Data Revolution is one of the first systematic attempts to strip back the hype surrounding our current data deluge and take stock of what is really going on. This is crucial because this hype is underpinned by very real societal change, threats to personal privacy and shifts in store for research methods. The book acts as a helpful wayfinding device in an unfamiliar terrain, which is still being reshaped, and is admirably written in a language relevant to social scientists, comprehensible to policy makers and accessible even to the less tech savvy among us.

The Data Revolution seems to present itself as the definitive account of this phenomenon, but in filling this role it ends up adopting a somewhat diplomatic posture. Kitchin takes all the correct and reasonable stances on the matter and advocates all the right courses of action, but he is not able to, in the context of this book, pursue these propositions fully. This review will attempt to tease out some of these latent potentials and how they might be pushed in future work, in particular the implications of the ‘performative’ character of both big data narratives and data infrastructures for social science research.

Kitchin’s book starts with the observation that ‘data’ is a misnomer – etymologically data should refer to phenomena in the world which can be abstracted, measured etc. as opposed to the representations and measurements themselves, which should by all rights be called ‘capta’. This is ironic because the worst offenders in what Kitchin calls “data boosterism” seem to conflate data with ‘reality’, unmooring data from its conditions of production and making the relationship between the two appear given or natural.

As Kitchin notes, following Bowker (2005), ‘raw data’ is an oxymoron: data are not so much mined as produced and are necessarily framed technically, ethically, temporally, spatially and philosophically. This is the central thesis of the book, that data and data infrastructures are not neutral and technical but also social and political phenomena. For those at the critical end of research with data, this is a starting assumption, but one which not enough practitioners heed. Most of the book is thus an attempt to flesh out these rapidly expanding data infrastructures and their politics….

Kitchin is at his best when revealing the gap between the narratives and the reality of data analysis such as the fallacy of empiricism – the assertion that, given the granularity and completeness of big data sets and the availability of machine learning algorithms which identify patterns within data (with or without the supervision of human coders), data can “speak for themselves”. Kitchin reminds us that no data set is complete and even these out-of-the-box algorithms are underpinned by theories and assumptions in their creation, and require context specific knowledge to unpack their findings. Kitchin also rightly raises concerns about the limits of big data, that access and interoperability of data is not given and that these gaps and silences are also patterned (Twitter is biased as a sample towards middle class, white, tech-savvy people). Yet, this language of veracity and reliability seems to suggest that big data is being conceptualised in relation to traditional surveys, or that our population is still the nation state, when big data could helpfully force us to reimagine our analytic objects and truth conditions and more pressingly, our ethics (Rieder, 2013).

However, performativity may again complicate things. As Kitchin observes, supermarket loyalty cards do not just create data about shopping, they encourage particular sorts of shopping; when research subjects change their behaviour to cater to the metrics and surveillance apparatuses built into platforms like Facebook (Bucher, 2012), then these are no longer just data points representing the social, but partially constitutive of new forms of sociality (this is also true of other types of data as discussed by Savage (2010), but in perhaps less obvious ways). This might have implications for how we interpret data, the distribution between quantitative and qualitative approaches (Latour et al., 2012) or even more radical experiments (Wilkie et al., 2014). Kitchin is relatively cautious about proposing these sorts of possibilities, which is not the remit of the book, though it clearly leaves the door open…(More)”

Transforming Government Information


Sharyn Clarkson at the (Interim) Digital Transformation Office (Australia): “Our challenge: How do we get the right information and services to people when and where they need it?

The public relies on Government for a broad range of information – advice for individuals and businesses, what services are available and how to access them, and how various rules and laws impact our lives.

The government’s digital environment has grown organically over the last couple of decades. At the moment, information is largely created and managed within agencies and published across more than 1200 disparate gov.au websites, plus a range of social media accounts, apps and other digital formats.

This creates some difficulties for people looking for government information. By publishing within agency silos we are presenting people with an agency-centric view of government information. This is a problem because people largely don’t understand or care about how government organises itself, and the structure of government does not map to the needs of people. Having a baby or travelling overseas? Up to a dozen government agencies may have information relevant to you. And as people’s needs span more than one agency, they end up with a disjointed and confusing user experience as they have to navigate across disparate government sites. And even if you begin at your favourite search engine how do you know which of the many government search results is the right place to start?

There are two government entry points already in place to help users – Australia.gov.au and business.gov.au – but they largely act as an umbrella across the 1200+ sites and currently only provide a very thin layer of whole of government information and mainly refer people off to other websites.

The establishment of the DTO has provided the first opportunity for people to come together and better understand how our underlying structural landscape is impacting people’s experience with government. It’s also given us an opportunity to take a step back and ask some of the big questions about how we manage information and what problems can only really be solved through whole of government transformation.

How do we make information and services easier to find? How do we make sure we provide information that people can trust and rely upon at times of need? How should the gov.au landscape be organised to make it easier for us to meet users’ needs and expectations? How many websites should we have – assuming 1200 is too many? What makes up a better user experience – does it mean all sites should look and feel the same? How can we provide government information at the places people naturally go looking for assistance – even if these are not government sites?

As we asked these questions we started to come across some central ideas:

  • What if we could decouple the authoring and management of information from the publishing process, so the subject experts in government still manage their content but we have flexibility to present it in more user-centric ways?
  • What if we unleashed government information? Making it possible for state and local governments, non-profit groups and businesses to deliver content and services alongside their own information to give better value to users.
  • Should we move the bureaucratic content (information about agencies and how they are managed such as annual reports, budget statements and operating rules) out of the way of core content and services for people? Can we simplify our environment and base it around topics and life events instead of agencies? What if we had people in government responsible for curating these topics and life events across agencies and creating simpler pathways for users?…(More)”

When America Says Yes to Government


Cass Sunstein in the New York Times: “In recent years, the federal government has adopted a large number of soft interventions that are meant to change behavior without mandates and bans. Among them: disclosure of information, such as calorie labels at chain restaurants; graphic warnings against, for example, distracted driving; and automatic enrollment in programs designed to benefit employees, like pension plans.

Informed by behavioral science, such reforms can have large effects while preserving freedom of choice. But skeptics deride these soft interventions as unjustified paternalism, an insult to dignity and a contemporary version of the nanny state. Some people fear that uses of behavioral science will turn out to be manipulative. They don’t want to be nudged.

But what do Americans actually think about soft interventions? I recently conducted a nationally representative survey of 563 people. Small though that number may seem, it gives a reasonable picture of what Americans think, with a margin of error of plus or minus 4.1 percentage points.
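As a sanity check, the stated ±4.1-point figure is consistent with the standard worst-case margin of error for a 95 per cent confidence level, z·√(p(1−p)/n) with p = 0.5 (this is the conventional formula, not a detail Sunstein gives):

```python
from math import sqrt

n = 563   # survey respondents
z = 1.96  # z-score for a 95% confidence level

# Worst case: p = 0.5 maximizes p * (1 - p), giving the widest interval
moe = z * sqrt(0.25 / n)

print(round(moe * 100, 1))  # → 4.1 (percentage points)
```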

The remarkable finding is that most Americans approve of these reforms and want a lot more of them — and their approval generally cuts across partisan lines….(More)