Some notes on smart cities and the corporatization of urban governance


Presentation by Constance Carr and Markus Hesse: “We want to address a discrepancy; that is, the discrepancy between the processes and practices of technological development on the one hand, and the production processes of urban change and urban problems on the other. There’s a gap here that we can illustrate with the case of the so-called ‘Google City’.

The scholarly literature on digital cities is quite clear that there are externalities, uncertainties and risks associated with the hype around, and the rash introduction of, ‘smartness’. To us, an old saying comes to mind: Don’t put the cart before the horse.

Obviously, digitization and technology have revolutionized geography in many ways. And this is nothing new. Roughly twenty years ago, with the rise of the Internet, some, such as MIT’s Bill Mitchell (1995), speculated that it and other information technologies would dissolve space into a ‘City of Bits’. However, even back then, statements like these didn’t go uncriticised by those who pointed to the inherent technological determinism and showed that the relationship between urban development, urban planning, and technological innovation is complex; that the relationship was neither new nor trivial, and that technology by itself would not automatically and necessarily be productive, beneficial, and central to cities.

What has changed is the proliferation of digital technologies and their applications. We agree with Ash et al. (2016) that geography has experienced a ‘digital turn’, with urban geography now produced by, through and of digitization. And while the digitalization of urbanity has provided benefits, it has also brought with it a number of unsolved problems.

First, behind the production of big data, algorithms, and digital design, there are certain epistemologies – ways of knowing. Data is not value-free. Rather, data is the end product of political framings and associated methods that structure its production. So, now that we “live in a present characterized by a […] diverse array of spatially-enabled digital devices, platforms, applications and services” (Ash et al. 2016: 28), we can interrogate how these processes and algorithms are informed by socio-economic inequalities, because the risk is that new technologies will simply reproduce them.

Second, the circulation of data around the globe raises questions about who owns and regulates data when they are stored and processed in remote geographic locations….(More)”.

Regulating disinformation with artificial intelligence


Paper for the European Parliamentary Research Service: “This study examines the consequences of the increasingly prevalent use of artificial intelligence (AI) in initiatives to counter disinformation upon freedom of expression, pluralism and the functioning of a democratic polity. The study examines the trade-offs in using automated technology to limit the spread of disinformation online. It presents options (from self-regulatory to legislative) to regulate automated content recognition (ACR) technologies in this context. Special attention is paid to the opportunities for the European Union as a whole to take the lead in setting the framework for designing these technologies in a way that enhances accountability and transparency and respects free speech. The present project reviews some of the key academic and policy ideas on technology and disinformation and highlights their relevance to European policy.

Chapter 1 introduces the background to the study and presents the definitions used. Chapter 2 scopes the policy boundaries of disinformation from economic, societal and technological perspectives, focusing on the media context, behavioural economics and technological regulation. Chapter 3 maps and evaluates existing regulatory and technological responses to disinformation. In Chapter 4, policy options are presented, paying particular attention to interactions between technological solutions, freedom of expression and media pluralism….(More)”.

Toward an Open Data Bias Assessment Tool: Measuring Bias in Open Spatial Data


Working Paper by Ajjit Narayanan and Graham MacDonald: “Data is a critical resource for government decisionmaking, and in recent years, local governments, in a bid for transparency, community engagement, and innovation, have released many municipal datasets on publicly accessible open data portals. More recently, advocates, reporters, and others have voiced concerns about the bias of algorithms used to guide public decisions and the data that power them.

Although significant progress is being made in developing tools for assessing algorithmic bias and transparency, we could not find any standardized tools available for assessing bias in open data itself. In other words, how can policymakers, analysts, and advocates systematically measure the level of bias in the data that power city decisionmaking, whether an algorithm is used or not?

To fill this gap, we present a prototype of an automated bias assessment tool for geographic data. This new tool will allow city officials, concerned residents, and other stakeholders to quickly assess the bias and representativeness of their data. The tool allows users to upload a file with latitude and longitude coordinates and receive simple metrics of spatial and demographic bias across their city.

The tool is built on geographic and demographic data from the Census and assumes that the population distribution in a city represents the “ground truth” of the underlying distribution in the uploaded data. To provide an illustrative example of the tool’s use and output, we test our bias assessment on three datasets—bikeshare station locations, 311 service request locations, and Low Income Housing Tax Credit (LIHTC) building locations—across a few hand-selected example cities….(More)”.
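The working paper does not spell out the metric behind these outputs, so the following is only a minimal sketch of one plausible approach, with hypothetical tract IDs, populations, and station counts: compare each Census tract’s share of the uploaded points to its share of the city’s population, and sum the absolute differences into a dissimilarity-style index. In practice, each latitude/longitude pair would first have to be geocoded to its tract.

```python
from collections import Counter

def spatial_bias_index(point_tracts, tract_populations):
    """Compare the tract-level distribution of uploaded points with the
    tract-level population distribution (the assumed 'ground truth').

    point_tracts: iterable of Census-tract IDs, one per uploaded point
        (in practice, each latitude/longitude pair would first be
        geocoded to its tract).
    tract_populations: dict mapping tract ID -> resident population.

    Returns a dissimilarity-style index in [0, 1]: 0 means the points
    mirror the population exactly; 1 means total mismatch.
    """
    point_counts = Counter(point_tracts)
    total_points = sum(point_counts.values())
    total_pop = sum(tract_populations.values())

    diff = 0.0
    for tract, pop in tract_populations.items():
        point_share = point_counts.get(tract, 0) / total_points
        pop_share = pop / total_pop
        diff += abs(point_share - pop_share)
    return diff / 2  # halve the sum so the index lies in [0, 1]

# Hypothetical example: bikeshare stations clustered in the smallest tract.
populations = {"tract_A": 1_000, "tract_B": 3_000, "tract_C": 6_000}
stations = ["tract_A"] * 8 + ["tract_B"] * 2  # tract_C has no stations
print(f"bias index: {spatial_bias_index(stations, populations):.2f}")  # 0.70
```

In this toy example, the index of 0.70 reflects that eight of ten stations fall in the least-populated tract while the most-populated tract has none, which is exactly the kind of spatial skew the tool is meant to surface.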

Africa Data Revolution Report 2018


Report by Jean-Paul Van Belle et al.: “The Africa Data Revolution Report 2018 delves into the recent evolution and current state of open data – with an emphasis on Open Government Data – in the African data communities. It explores key countries across the continent, researches a wide range of open data initiatives, and benefits from global thematic expertise. This second edition improves on the process, methodology and collaborative partnerships of the first edition.

It draws from country reports, existing global and continental initiatives, and key experts’ input, in order to provide a deep analysis of the actual impact of open data in the African context. In particular, this report features a dedicated Open Data Barometer survey as well as a special 2018 Africa Open Data Index regional edition surveying the status and impact of open data and dataset availability in 30 African countries. The research is complemented with six in-depth qualitative case studies featuring the impact of open data in Kenya, South Africa (Cape Town), Ghana, Rwanda, Burkina Faso and Morocco. The report was critically reviewed by an eminent panel of experts.

Findings: In some governments, there is a slow iterative cycle of innovation, adoption, resistance and re-alignment that finally results in Open Government Data (OGD) institutionalization and eventual maturity. There is huge diversity among African governments in embracing open data, and each country presents a complex and unique picture. In several African countries, there appears to be genuine political will to open up government datasets, not only for increased transparency but also to achieve economic impact and social equity and to stimulate innovation.

The role of open data intermediaries is crucial and has been insufficiently recognized in the African context. Open data in Africa needs a vibrant, dynamic, open and multi-tier data ecosystem if the datasets are to make a real impact. Citizens are unlikely to access open data themselves. But the democratization of information and communication platforms has opened up opportunities for a large and diverse set of intermediaries to explore and combine relevant data sources, sometimes with private or leaked data. The news media, NGOs and advocacy groups, and to a much lesser extent academics and social or profit-driven entrepreneurs, have shown that OGD can create real impact on the achievement of the SDGs…

The report encourages national policy makers and international funding or development agencies to consider the status, impact and future of open data in Africa on the basis of this research. Other stakeholders working with or for open data can hopefully also learn from what is happening on the continent. It is hoped that the findings and recommendations contained in the report will form the basis of a robust, informed and dynamic debate around open government data in Africa….(More)”.

EU Data Protection Rules and U.S. Implications


In Focus by the Congressional Research Service: “U.S. and European citizens are increasingly concerned about ensuring the protection of personal data, especially online. A string of high-profile data breaches at companies such as Facebook and Google has contributed to heightened public awareness. The European Union’s (EU) new General Data Protection Regulation (GDPR)—which took effect on May 25, 2018—has drawn the attention of U.S. businesses and other stakeholders, prompting debate on U.S. data privacy and protection policies.

Both the United States and the 28-member EU assert that they are committed to upholding individual privacy rights and ensuring the protection of personal data, including electronic data. However, data privacy and protection issues have long been sticking points in U.S.-EU economic and security relations, in part because of differences in U.S. and EU legal regimes and approaches to data privacy.

The GDPR highlights some of those differences and poses challenges for U.S. companies doing business in the EU. The United States does not broadly restrict cross-border data flows and has traditionally regulated privacy at a sectoral level to cover certain types of data. The EU considers the privacy of communications and the protection of personal data to be fundamental rights, which are codified in EU law. Europe’s history with fascist and totalitarian regimes informs the EU’s views on data protection and contributes to the demand for strict data privacy controls. The EU regards current U.S. data protection safeguards as inadequate; this has complicated the conclusion of U.S.-EU information-sharing agreements and raised concerns about U.S.-EU data flows….(More)”.

Data Trusts: Ethics, Architecture and Governance for Trustworthy Data Stewardship


Web Science Institute Paper by Kieron O’Hara: “In their report on the development of the UK AI industry, Wendy Hall and Jérôme Pesenti recommend the establishment of data trusts, “proven and trusted frameworks and agreements” that will “ensure exchanges [of data] are secure and mutually beneficial” by promoting trust in the use of data for AI. Hall and Pesenti leave the structure of data trusts open, and the purpose of this paper is to explore the questions of (a) what existing structures data trusts can exploit, and (b) what relationship data trusts have to trusts as they are understood in law.

The paper defends the following thesis: A data trust works within the law to provide ethical, architectural and governance support for trustworthy data processing.

Data trusts are therefore both constraining and liberating. They constrain: they respect current law, so they cannot render currently illegal actions legal. They are intended to increase trust, and so they will typically act as further constraints on data processors, adding the constraints of trustworthiness to those of law. Yet they also liberate: if data processors are perceived as trustworthy, they will get improved access to data.

Most work on data trusts has up to now focused on gaining and supporting the trust of data subjects in data processing. However, all actors involved in AI – data consumers, data providers and data subjects – have trust issues which data trusts need to address.

Furthermore, it is not only personal data that creates trust issues; the same may be true of any dataset whose release might put an organisation’s competitive advantage at risk. The paper addresses four areas….(More)”.

Harnessing the Power of Open Data for Children and Families


Article by Kathryn L.S. Pettit and Rob Pitingolo: “Child advocacy organizations, such as members of the KIDS COUNT network, have proven the value of using data to advocate for policies and programs to improve the lives of children and families. These organizations use data to educate policymakers and the public about how children are faring in their communities. They understand the importance of high-quality information for policy and decisionmaking. And in the past decade, many state governments have embraced the open data movement. Their data portals promote government transparency and increase data access for a wide range of users inside and outside government.

At the request of the Annie E. Casey Foundation, which funds the KIDS COUNT network, the authors conducted research to explore how these state data efforts could bring greater benefits to local communities. Interviews with child advocates and open data providers confirmed the opportunity for child advocacy organizations and state governments to leverage open data to improve the lives of children and families. But accomplishing this goal will require new practices on both sides.

This brief first describes the current state of practice for child advocates using data and for state governments publishing open data. It then provides suggestions for what it would take from both sides to increase the use of open data to improve the lives of children and families. Child and family advocates will find five action steps in section 2. These steps encourage them to assess their data needs, build relationships with state data managers, and advocate for new data and preservation of existing data.
State agency staff will find five action steps in section 3. These steps describe how staff can engage diverse stakeholders, including agency staff beyond typical “data people” and data users outside government. Although this brief focuses on state-level institutions, local advocates and governments will find these lessons relevant. In fact, many of the lessons and best practices are based on pioneering efforts at the local level….(More)”.

Big Data and Dahl’s Challenge of Democratic Governance


Alex Ingrams in the Review of Policy Research: “Big data applications have been acclaimed as potentially transformative for the public sector. But despite this acclaim, most theory of big data is narrowly focused on technocratic goals. Conceptual frameworks that situate big data within democratic governance systems and recognize the role of citizens are still missing. This paper explores the democratic governance impacts of big data in three policy areas, using Robert Dahl’s dimensions of control and autonomy. Key impacts and potential tensions are highlighted. There is evidence of impacts on both dimensions, but the dimensions conflict as well as align in notable ways, and focused policy efforts will be needed to find a balance….(More)”.

Blockchain and distributed ledger technologies in the humanitarian sector


Report by Giulio Coppi and Larissa Fast at ODI (Overseas Development Institute): “Blockchain and the wider category of distributed ledger technologies (DLTs) promise a more transparent, accountable, efficient and secure way of exchanging decentralised stores of information that are independently updated, automatically replicated and immutable. The key components of DLTs include shared recordkeeping, multi-party consensus, independent validation, tamper evidence and tamper resistance.
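The tamper-evidence property these components describe rests on hash chaining: each entry’s hash covers both its record and the hash of the entry before it, so a retroactive edit invalidates every later hash. The sketch below is an illustrative assumption rather than any system described in the report: a single-node Python toy with hypothetical cash-transfer records, which omits the multi-party consensus, replication and independent validation that distinguish a real DLT.

```python
import hashlib
import json

def entry_hash(record: dict, prev_hash: str) -> str:
    """Hash the record together with the previous entry's hash,
    chaining every entry to all of the history before it."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    """Append a record whose hash depends on the entire chain so far."""
    prev_hash = ledger[-1]["hash"] if ledger else "genesis"
    ledger.append({"record": record, "hash": entry_hash(record, prev_hash)})

def verify(ledger: list) -> bool:
    """Recompute the chain; editing any earlier record breaks every
    hash that follows it, making tampering evident."""
    prev_hash = "genesis"
    for entry in ledger:
        if entry["hash"] != entry_hash(entry["record"], prev_hash):
            return False
        prev_hash = entry["hash"]
    return True

# Hypothetical example: log two cash-transfer disbursements, then tamper.
ledger = []
append(ledger, {"beneficiary": "B-001", "amount": 50})
append(ledger, {"beneficiary": "B-002", "amount": 75})
print(verify(ledger))                # True
ledger[0]["record"]["amount"] = 500  # retroactive edit
print(verify(ledger))                # False: the edit is detectable
```

Tamper evidence of this kind is cheap to implement; it is the distributed parts (consensus among parties that do not trust each other, and replication that makes records hard to suppress) that carry most of the cost and complexity the report weighs for humanitarian settings.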

Building on these claims, proponents suggest DLTs can address common problems of non-profit organisations and NGOs, such as transparency, efficiency, scale and sustainability. Current humanitarian uses of DLT, illustrated in this report, include financial inclusion, land titling, remittances, improving the transparency of donations, reducing fraud, tracking support to beneficiaries from multiple sources, transforming governance systems, micro-insurance, cross-border transfers, cash programming, grant management and organisational governance.

This report, commissioned by the Global Alliance for Humanitarian Innovation (GAHI), examines current DLT uses by the humanitarian sector to outline lessons for the project, policy and system levels. It offers recommendations to address the challenges that must be overcome before DLTs can be ethically, safely, appropriately and effectively scaled in humanitarian contexts….(More)”.

Evolving Measurement for an Evolving Economy: Thoughts on 21st Century US Economic Statistics


Ron S. Jarmin in the Journal of Economic Perspectives: “The system of federal economic statistics developed in the 20th century has served the country well, but the current methods for collecting and disseminating these data products are unsustainable. These statistics are heavily reliant on sample surveys. Recently, however, response rates for both household and business surveys have declined, increasing costs and threatening quality. Existing statistical measures, many developed decades ago, may also miss important aspects of our rapidly evolving economy; moreover, they may not be sufficiently accurate, timely, or granular to meet the increasingly complex needs of data users. Meanwhile, the rapid proliferation of online data and more powerful computation make privacy and confidentiality protections more challenging. There is broad agreement on the need to transform government statistical agencies from the 20th century survey-centric model to a 21st century model that blends structured survey data with administrative and unstructured alternative digital data sources. In this essay, I describe some work underway that hints at what 21st century official economic measurement will look like and offer some preliminary comments on what is needed to get there….(More)”.