The Future State CIO: How the Role Will Drive Innovation


Report by Accenture/NASCIO: “…exploring the future role of the state CIO and how that role will drive innovation.

The research included interviews and a survey of state CIOs to understand their role in promoting innovation in government.

  • The study explored how state IT organizations build the capacity to innovate and which best practices help in doing so.
  • We also examined how state CIOs embrace new and emerging technologies to create the best government outcomes.
  • Our report illuminates compelling opportunities, persistent obstacles, strategies for accelerating innovation, and inspiring real-world case studies.
  • The report presents a set of practical recommendations for driving innovation…(More)”.

Rejuvenating Democracy Promotion


Essay by Thomas Carothers: “Adverse political developments in both established and newer democracies, especially the abdication by the United States of its traditional leadership role, have cast international democracy support into doubt. Yet international action on behalf of democracy globally remains necessary and possible. Moreover, some important elements of continuity remain, including overall Western spending on democracy assistance. Democracy support must adapt to its changed circumstances by doing more to take new geopolitical realities into account; effacing the boundary between support for democracy in new and in established democracies; strengthening the economic dimension of democracy assistance; and moving technological issues to the forefront…(More)”.

Tech groups cannot be allowed to hide from scrutiny


Marietje Schaake at the Financial Times: “Technology companies have governments over a barrel. Whether they are maximising traffic flow efficiency, matching pupils with their school preferences, or trying to anticipate drought based on satellite and soil data, most governments heavily rely on critical infrastructure and artificial intelligence developed by the private sector. This growing dependence has profound implications for democracy.

An unprecedented information asymmetry is growing between companies and governments. We can see this in the long-running investigation into interference in the 2016 US presidential election. Companies build voter registries, voting machines and tallying tools, while social media companies sell precisely targeted advertisements using information gleaned by linking data on friends, interests, location, shopping and search.

This has big privacy and competition implications, yet oversight is minimal. Governments, researchers and citizens risk being blindsided by the machine room that powers our lives and vital aspects of our democracies. Governments and companies have fundamentally different incentives on transparency and accountability.

While openness is the default and secrecy the exception for democratic governments, companies resist providing transparency about their algorithms and business models. Many of them actively prevent accountability, citing rules that protect trade secrets.

We must revisit these protections when they shield companies from oversight. There is a place for protecting proprietary information from commercial competitors, but the scope and context of such protections need to be clarified and balanced when they have an impact on democracy and the rule of law.

Regulators must act to ensure that those designing and running algorithmic processes do not abuse trade secret protections. Tech groups also use the EU’s General Data Protection Regulation to deny access to company information. Although the regulation was enacted to protect citizens against the mishandling of personal data, it is now being wielded cynically to deny scientists access to data sets for research. The European Data Protection Supervisor has intervened, but problems could recur.

To mitigate concerns about the power of AI, provider companies routinely promise that the applications will be understandable, explainable, accountable, reliable, contestable, fair and — don’t forget — ethical.

Yet there is no way to test these subjective notions without access to the underlying data and information. Without clear benchmarks and information to match, proper scrutiny of the way vital data is processed and used will be impossible….(More)”.

Innovation labs and co-production in public problem solving


Paper by Michael McGann, Tamas Wells & Emma Blomkamp: “Governments are increasingly establishing innovation labs to enhance public problem solving. Despite the speed at which these new units are being established, they have only recently begun to receive attention from public management scholars. This study assesses the extent to which labs are enhancing strategic policy capacity through pursuing more collaborative and citizen-centred approaches to policy design. Drawing on original case study research of five labs in Australia and New Zealand, it examines the structure of labs’ relationships to government partners, and the extent and nature of their activities in promoting citizen participation in public problem solving….(More)”.

Lies, Deception and Democracy


Essay by Richard Bellamy: “This essay explores how far democracy is compatible with lies and deception, and whether it encourages or discourages their use by politicians. Neo-Kantian arguments, such as Newey’s, that lies and deception undermine individual autonomy and the possibility for consent go too far, given that no democratic process can be regarded as a plausible mechanism for achieving collective consent to state policies. However, they can be regarded as incompatible with a more modest account of democracy as a system of public equality among political equals.

On this view, the problem with lies and deception derives from their being instruments of manipulation and domination. Both can be distinguished from ‘spin’, with a working democracy being capable of uncovering them and so incentivising politicians to be truthful. Nevertheless, while lies and deception will find you out, bullshit and post-truth disregard and subvert truth respectively, and as such prove more pernicious as they admit of no standard whereby they might be challenged….(More)”.

Crossing the Digital Divide: Applying Technology to the Global Refugee Crisis


Report by Shelly Culbertson, James Dimarogonas, Katherine Costello, and Serafina Lanna: “In the past two decades, the global population of forcibly displaced people has more than doubled, from 34 million in 1997 to 71 million in 2018. Amid this growing crisis, refugees and the organizations that assist them have turned to technology as an important resource, and technology can and should play an important role in solving problems in humanitarian settings. In this report, the authors analyze technology uses, needs, and gaps, as well as opportunities for better using technology to help displaced people and improving the operations of responding agencies. The authors also examine inherent ethical, security, and privacy considerations; explore barriers to the successful deployment of technology; and outline some tools for building a more systematic approach to such deployment. The study approach included a literature review, semi-structured interviews with stakeholders, and focus groups with displaced people in Colombia, Greece, Jordan, and the United States. The authors provide several recommendations for more strategically using and developing technology in humanitarian settings….(More)”.

On Digital Disinformation and Democratic Myths


David Karpf at MediaWell: “…How many votes did Cambridge Analytica affect in the 2016 presidential election? How much of a difference did the company actually make?

Cambridge Analytica has become something of a Rorschach test among those who pay attention to digital disinformation and microtargeted propaganda. Some hail the company as a digital Svengali, harnessing the power of big data to reshape the behavior of the American electorate. Others suggest the company was peddling digital snake oil, with outlandish marketing claims that bore little resemblance to its mundane product.

One thing is certain: the company has become a household name, practically synonymous with disinformation and digital propaganda in the aftermath of the 2016 election. It has claimed credit for the surprising success of the Brexit referendum and for the Trump digital strategy. Journalists such as Carole Cadwalladr and Hannes Grassegger and Mikael Krogerus have published longform articles that dive into the “psychographic” breakthroughs that the company claims to have made. Cadwalladr also exposed the links between the company and a network of influential conservative donors and political operatives. Whistleblower Chris Wylie, who worked for a time as the company’s head of research, further detailed how it obtained a massive trove of Facebook data on tens of millions of American citizens, in violation of Facebook’s terms of service. The Cambridge Analytica scandal has been a driving force in the current “techlash,” and has been the topic of congressional hearings, documentaries, mass-market books, and scholarly articles.

The reasons for concern are numerous. The company’s own marketing materials boasted about radical breakthroughs in psychographic targeting—developing psychological profiles of every US voter so that political campaigns could tailor messages to exploit psychological vulnerabilities. Those marketing claims were paired with disturbing revelations about the company violating Facebook’s terms of service to scrape tens of millions of user profiles, which were then compiled into a broader database of US voters. Cambridge Analytica behaved unethically. It either broke a lot of laws or demonstrated that old laws needed updating. When the company shut down, no one seemed to shed a tear.

But what is less clear is just how different Cambridge Analytica’s product actually was from the type of microtargeted digital advertisements that every other US electoral campaign uses. Many of the most prominent researchers warning the public about how Cambridge Analytica uses our digital exhaust to “hack our brains” are marketing professors, more accustomed to studying the impact of advertising in commerce than in elections. The political science research community has been far more skeptical. An investigation from Nature magazine documented that the evidence of Cambridge Analytica’s independent impact on voter behavior is basically nonexistent (Gibney 2018). There is no evidence that psychographic targeting actually works at the scale of the American electorate, and there is also no evidence that Cambridge Analytica in fact deployed psychographic models while working for the Trump campaign. The company clearly broke Facebook’s terms of service in acquiring its massive Facebook dataset. But it is not clear that the massive dataset made much of a difference.

At issue in the Cambridge Analytica case are two baseline assumptions about political persuasion in elections. First, what should be our point of comparison for digital propaganda in elections? Second, how does political persuasion in elections compare to persuasion in commercial arenas and marketing in general?…(More)”.

Copy, Paste, Legislate


The Center for Public Integrity: “Do you know if a bill introduced in your statehouse — it might govern who can fix your shattered iPhone screen or whether you can still sue a pedophile priest years later — was actually written by your elected lawmakers? Use this new tool to find out.

Spoiler alert: The answer may well be no.

Thousands of pieces of “model legislation” are drafted each year by business organizations and special interest groups and distributed to state lawmakers for introduction. These copycat bills influence policymaking across the nation, state by state, often with little scrutiny. This news application was developed by the Center for Public Integrity, part of a year-long collaboration with USA TODAY and the Arizona Republic to bring the practice into the light….(More)”.

Open Democracy and Digital Technologies


Paper by Hélène Landemore: “…looks at the connection between democratic theory and technological constraints, and argues for renovating our paradigm of democracy to make the most of the technological opportunities offered by the digital revolution. The most attractive normative theory of democracy currently available—Habermas’ model of a two-track deliberative sphere—is, for all its merits, a self-avowed rationalization of representative democracy, a system born in the 18th century under different epistemological, conceptual, and technological constraints. In this paper I show the limits of this model and defend instead an alternative paradigm of democracy I call “open democracy,” in which digital technologies are assumed to make it possible to transcend a number of dichotomies, including that between ordinary citizens and democratic representatives.

Rather than just imagining a digitized version or extension of existing institutions and practices—representative democracy as we know it—I thus take the opportunities offered by the digital revolution (its technological “affordances,” in the jargon) to envision new democratic institutions and means of democratic empowerment, some of which are illustrated in the vignette with which this paper started. In other words, rather than start from what is—our electoral democracies—I start from what democracy could mean if we reinvented it more or less from scratch today with the help of digital technologies.

The first section lays out the problems with and limits of our current practice and theory of democracy. The second section traces these problems to conceptual design flaws partially induced by 18th-century conceptual, epistemological, and technological constraints. Section three lays out an alternative theory of democracy I call “open democracy,” which avoids some of these design flaws, and introduces the institutional features of this new paradigm that are specifically enabled by digital technologies: deliberation and democratic representation….(More)”.

Meaningful Inefficiencies: Civic Design in an Age of Digital Expediency


Book by Eric Gordon and Gabriel Mugar: “Public trust in the institutions that mediate civic life, from governing bodies to newsrooms, is low. In facing this challenge, many organizations assume that ensuring greater efficiency will build trust. As a result, these organizations are quick to adopt new technologies to enhance what they do, whether it’s a new app or dashboard. However, efficiency, or charting a path to a goal with the least amount of friction, is not itself always built on a foundation of trust.

Meaningful Inefficiencies is about the practices undertaken by civic designers that challenge the normative applications of “smart technologies” in order to build or repair trust with publics. Based on over sixty interviews with changemakers in public-serving organizations throughout the United States, as well as detailed case studies, this book provides a practical and deeply philosophical picture of civic life in transition. The designers in this book are not professional designers, but practitioners embedded within organizations who have adopted an approach to public engagement Eric Gordon and Gabriel Mugar call “meaningful inefficiencies,” or the deliberate design of less efficient over more efficient means of achieving some ends. This book illustrates how civic designers are creating meaningful inefficiencies within public-serving organizations. It also encourages a rethinking of how innovation within these organizations is understood, applied, and sought after. Different from market innovation, civic innovation is not just about invention and novelty; it is concerned with building communities around novelty, and cultivating deep and persistent trust.

At its core, Meaningful Inefficiencies underlines that good civic innovation will never just involve one single public good, but must instead negotiate a plurality of publics. In doing so, it creates the conditions for those publics to play, resulting in people truly caring for the world. Meaningful Inefficiencies thus presents an emergent and vitally needed approach to creating civic life at a moment when smart and efficient are the dominant forces in social and organizational change….(More)”.