Paper by Huaxiong Jiang, Stan Geertman & Patrick Witte: “This paper argues for a specific urban planning perspective on smart governance that we call “smart urban governance,” which represents a move away from the technocratic way of governing cities often found in smart cities. A framework for smart urban governance is proposed on the basis of three intertwined key components, namely spatial, institutional, and technological components. To test the applicability of the framework, we conducted an international questionnaire survey on smart city projects. We then identified and discursively analyzed two smart city projects—Smart Nation Singapore and Helsinki Smart City—to illustrate how this framework works in practice. The questionnaire survey revealed that smart urban governance varies remarkably: as urban issues differ across contexts, the governance modes and relevant ICT functionalities applied also differ considerably. Moreover, the case analysis indicates that a focus on substantive urban challenges helps to define appropriate modes of governance and develop dedicated technologies that can contribute to solving specific smart city challenges. The analyses of both cases highlight the importance of context (cultural, political, economic, etc.) in analyzing interactions between the components. In this way, smart urban governance promotes a sociotechnical way of governing cities in the “smart” era by starting with the urban issue at stake, promoting demand-driven governance modes, and shaping technological intelligence more socially, given the specific context….(More)”.
Paper by Florian Eyert, Florian Irgmaier, and Lena Ulbricht: “In this article, we take forward recent initiatives to assess regulation based on contemporary computer technologies such as big data and artificial intelligence. In order to characterize current phenomena of regulation in the digital age, we build on Karen Yeung’s concept of “algorithmic regulation,” extending it by building bridges to the fields of quantification, classification, and evaluation research, as well as to science and technology studies. This allows us to develop a more fine-grained conceptual framework that analyzes the three components of algorithmic regulation as representation, direction, and intervention and proposes subdimensions for each. Based on a case study of the algorithmic regulation of Uber drivers, we show the usefulness of the framework for assessing regulation in the digital age and as a starting point for critique and alternative models of algorithmic regulation….(More)”.
Paper by Heather McKay, Sara Haviland, and Suzanne Michael: “There is increasing interest in sharing data that was once siloed in separate agencies, both across agencies and even between states. Driving this is a need to better understand how people experience education and work, and their pathways through each. A data-sharing approach offers many possible advantages, allowing states to leverage pre-existing data systems to conduct increasingly sophisticated and complete analyses. However, information sharing across state organizations presents a series of complex challenges, one of which is the central role trust plays in building successful data-sharing systems. Trust building between organizations is therefore crucial to ensuring project success.
This brief examines the process of building trust within the context of the development and implementation of the Multistate Longitudinal Data Exchange (MLDE). The brief is based on research and evaluation activities conducted by Rutgers’ Education & Employment Research Center (EERC) over the past five years, which included 40 interviews with state leaders and the Western Interstate Commission for Higher Education (WICHE) staff, observations of user group meetings, surveys, and MLDE document analysis. It is one in a series of MLDE briefs developed by EERC….(More)”.
Data Collaborative Case Study by Michelle Winowatan, Andrew J. Zahuranec, Andrew Young, and Stefaan Verhulst: “Following the 2015 earthquake in Nepal, Flowminder, a data analytics nonprofit, and NCell, a mobile operator in Nepal, formed a data collaborative. Using call detail records (CDR, a type of mobile operator data) provided by NCell, Flowminder estimated the number of people displaced by the earthquake and their location. The result of the analysis was provided to various humanitarian agencies responding to the crisis in Nepal to make humanitarian aid delivery more efficient and targeted.
Data Collaboratives Model: Based on our typology of data collaborative practice areas, the initiative follows the trusted intermediary model of data collaboration, specifically a third-party analytics approach. Third-party analytics projects involve trusted intermediaries — such as Flowminder — who access private-sector data, conduct targeted analysis, and share insights with public or civil sector partners without sharing the underlying data. This approach enables public interest uses of private-sector data while retaining strict access control. It brings in outside data expertise that would likely not otherwise be available through direct bilateral collaboration between data holders and users….(More)”.
Research article by Abhishek Nagaraj, Esther Shears, and Mathijs de Vaan: “Data access is critical to empirical research, but past work on open access is largely restricted to the life sciences and has not directly analyzed the impact of data access restrictions. We analyze the impact of improved data access on the quantity, quality, and diversity of scientific research. We focus on the effects of a shift in the accessibility of satellite imagery data from Landsat, a NASA program that provides valuable remote-sensing data. Our results suggest that improved access to scientific data can lead to a large increase in the quantity and quality of scientific research. Further, better data access disproportionately enables the entry of scientists with fewer resources, and it promotes diversity of scientific research….(More)”
Case Notes by Mitchell B. Weiss and Sarah Mehta: “By April 7, 2020, over 1.4 million people worldwide had contracted the novel coronavirus (COVID-19). Governments raced to curb the spread of COVID-19 by scaling up testing, quarantining those infected, and tracing their possible contacts. It had taken Singapore’s Government Technology Agency (GovTech) and Ministry of Health (MOH) just eight weeks to develop the world’s first nationwide Bluetooth-based contact tracing system, TraceTogether, and deploy it in an attempt to slow the spread of COVID-19. From late January to mid-March 2020, GovTech’s Jason Bay and his team raced to create a technology that would supplement the work of Singapore’s human contact tracers. Days after its launch, Singapore’s foreign minister announced plans to open source the technology. Now, in early April, TraceTogether was a beta for the world. Whether the system would really help in Singapore, and whether other countries should adopt it, were still wide-open questions….(More)”.
Paper by Adrian Smith & Pedro Prieto Martín: “Digital platforms for urban democracy are analyzed in Madrid and Barcelona. These platforms permit citizens to debate urban issues with other citizens; to propose developments, plans, and policies for city authorities; and to influence how city budgets are spent. Contrasting with neoliberal assumptions about Smart Citizenship, the technopolitics discourse underpinning these developments recognizes that the technologies facilitating participation must themselves be developed democratically. That is, technopolitical platforms are built and operated as open, commons-based processes for learning, reflection, and adaptation. These features prove vital to platform implementation consistent with aspirations for citizen engagement and activism….(More)”.
Paper by Robert M. Gonzalez, Matthew Harvey, and Foteini Tzachrista: “Empirical evidence on the effectiveness of grassroots monitoring is mixed. This paper proposes a previously unexplored mechanism that may explain this result. We argue that the presence of credible and effective top-down monitoring alternatives can undermine citizen participation in grassroots monitoring efforts. Building on Olken’s (2009) road-building field experiment in Indonesia, we find a large and robust effect of the participation interventions on missing expenditures in villages without an audit in place. However, this effect vanishes as soon as an audit is simultaneously implemented in the village. We find evidence of crowding-out effects: in government audit villages, individuals are less likely to attend, talk, and actively participate in accountability meetings. They are also significantly less likely to voice general problems and corruption-related problems, and to take serious actions to address these problems. Despite policies promoting joint implementation of top-down and bottom-up interventions, this paper shows that top-down monitoring can undermine rather than complement grassroots efforts….(More)”.
Essay collection edited by Nesta: “China is striding ahead of the rest of the world in terms of its investment in artificial intelligence (AI), rate of experimentation and adoption, and breadth of applications. In 2017, China announced its aim of becoming the world leader in AI technology by 2030. AI innovation is now a key national priority, with central and local government spending on AI estimated to be in the tens of billions of dollars.
While Europe and the US are also pursuing AI strategies designed to transform the public sector, there has been surprisingly little analysis of what practical lessons can be learned from China’s use of AI in public services. Given China’s rapid progress in this area, the rest of the world needs to pay attention to developments there if it wants to keep pace.
This essay collection finds that examining China’s experience of public sector innovation offers valuable insights for policymakers. Not everything is applicable to a western context – there are social, political and ethical concerns that arise from China’s use of new technologies in public services and governance – but there is still much that can be learned from its experience while also acknowledging what should be criticized and avoided….(More)”.
Podcast Episode by Jill Lepore: “In 1966, just as the foundations of the Internet were being imagined, the federal government considered building a National Data Center. It would be a centralized federal facility to hold computer records from each federal agency, in the same way that the Library of Congress holds books and the National Archives holds manuscripts. Proponents argued that it would help regulate and compile the vast quantities of data the government was collecting. Quickly, though, fears about privacy, government conspiracies, and government ineptitude buried the idea. But now, that National Data Center looks like a missed opportunity to create rules about data and privacy before the Internet took off. And in the absence of government action, corporations have made those rules themselves….(More)”.