Boston experimented with using generative AI for governing. It went surprisingly well


Article by Santiago Garces and Stephen Goldsmith: “…we see the possible advances of generative AI as having the most potential. For example, Boston asked OpenAI to “suggest interesting analyses” after we uploaded 311 data. In response, it suggested two things: time series analysis by case time, and a comparative analysis by neighborhood. This meant that city officials spent less time navigating the mechanics of computing an analysis, and had more time to dive into the patterns of discrepancy in service. The tools make graphs, maps, and other visualizations with a simple prompt. With lower barriers to analyze data, our city officials can formulate more hypotheses and challenge assumptions, resulting in better decisions.
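As a rough illustration, the two analyses the tool suggested amount to only a few lines in a library like pandas. Everything below is hypothetical: the column names (`opened`, `neighborhood`) and sample rows are illustrative stand-ins, not Boston's actual 311 schema.

```python
import pandas as pd

# Hypothetical 311-style service requests; column names and values
# are illustrative, not Boston's actual 311 data.
records = pd.DataFrame({
    "opened": pd.to_datetime([
        "2023-01-03", "2023-01-17", "2023-02-02",
        "2023-02-20", "2023-02-21", "2023-03-05",
    ]),
    "neighborhood": [
        "Dorchester", "Roxbury", "Dorchester",
        "Allston", "Roxbury", "Dorchester",
    ],
})

# Time series analysis: case counts per month ("MS" = month-start bins).
monthly = records.resample("MS", on="opened").size()

# Comparative analysis: case counts by neighborhood.
by_neighborhood = records["neighborhood"].value_counts()

print(monthly.tolist())           # monthly case volume, Jan through Mar
print(by_neighborhood.to_dict())  # case counts per neighborhood
```

Either result drops straight into a plotting call (e.g. `monthly.plot()`), which is the kind of one-prompt chart-making the article describes.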

Not all city officials have the engineering and web development experience needed to run these tests and code. But this experiment shows that other city employees, without any STEM background, could, with just a bit of training, utilize these generative AI tools to supplement their work.

To make this possible, more authority would need to be granted to frontline workers who too often have their hands tied with red tape. Therefore, we encourage government leaders to allow workers more discretion to solve problems, identify risks, and check data. This is not inconsistent with accountability; rather, supervisors can utilize these same generative AI tools to identify patterns or outliers—say, where race is inappropriately playing a part in decision-making, or where program effectiveness drops off (and why). These new tools will more quickly provide an indication of which interventions are making a difference, or precisely where a historic barrier is continuing to harm an already marginalized community.

Civic groups will be able to hold government accountable in new ways, too. This is where the linguistic power of large language models really shines: Public employees and community leaders alike can request that tools create visual process maps, build checklists based on a description of a project, or monitor progress and compliance. Imagine if people who have a deep understanding of a city—its operations, neighborhoods, history, and hopes for the future—can work toward shared goals, equipped with the most powerful tools of the digital age. Gatekeepers of formerly mysterious processes will lose their stranglehold, and expediters versed in state and local ordinances, codes, and standards will no longer be necessary to maneuver around things like zoning or permitting processes.

Numerous challenges would remain. Public workforces would still need better data analysis skills in order to verify whether a tool is following the right steps and producing correct information. City and state officials would need technology partners in the private sector to develop and refine the necessary tools, and these relationships raise challenging questions about privacy, security, and algorithmic bias…(More)”

Managing smart city governance – A playbook for local and regional governments


Report by UN Habitat: “This playbook and its recommendations are primarily aimed at municipal governments and their political leaders, local administrators, and public officials who are involved in smart city initiatives. The recommendations, which are delineated in the subsequent sections of this playbook, are intended to help develop more effective, inclusive, and sustainable governance practices for urban digital transformations. The guidance offered on these pages could also be useful for national agencies, private companies, non-governmental organizations, and all stakeholders committed to promoting the sustainable development of urban communities through the implementation of smart city initiatives…(More)”.

Cities are ramping up to make the most of generative AI


Blog by CityLab: “Generative artificial intelligence promises to transform the way we work, and city leaders are taking note. According to a recent survey by Bloomberg Philanthropies in partnership with the Centre for Public Impact, the vast majority of mayors (96 percent) are interested in how they can use generative AI tools like ChatGPT—which rely on machine learning to identify patterns in data and create, or generate, new content after being fed prompts—to improve local government. Of those cities surveyed, 69 percent report that they are already exploring or testing the technology. Specifically, they’re interested in how it can help them more quickly and successfully address emerging challenges with traffic and transportation, infrastructure, public safety, climate, education, and more.

Yet even as a majority of city leaders surveyed are exploring generative AI’s potential, only a small fraction of them (2 percent) are actively deploying the technology. They indicated there are a number of issues getting in the way of broader implementation, including a lack of technical expertise, budgetary constraints, and ethical considerations like security, privacy, and transparency…(More)”.

City Science


Book by Ramon Gras and Jeremy Burke: “The Aretian team, a spin-off company from the Harvard Innovation Lab, has developed a city science methodology to evaluate the relationship between city form and urban performance. This book illuminates the relationship between a city’s spatial design and the quality of life it affords the general population. Among the frameworks presented in this volume are measuring innovation economies to design Innovation Districts, mapping social networks and patterns to inform organizational design, and analyzing city topology, morphology, entropy, and scale to create 15-Minute Cities.
Urban designers, architects, and engineers will therefore be able to tackle complex urban design challenges by applying the authors’ frameworks and findings in their own work. Case studies present key insights from advanced, data-driven geospatial analyses of cities around the world in an illustrative manner. This inaugural book by Aretian Urban Analytics and Design will give readers a new set of tools to learn from, expand, and develop for the healthy growth of cities and regions around the world…(More)”.

Smart City Data Governance


OECD Report: “Smart cities leverage technologies, in particular digital, to generate a vast amount of real-time data to inform policy- and decision-making for efficient and effective public service delivery. Their success largely depends on the availability and effective use of data. However, the amount of data generated is growing more rapidly than governments’ capacity to store and process it, and the growing number of stakeholders involved in data production, analysis and storage pushes cities’ data management capacity to the limit. Despite the wide range of local and national initiatives to enhance smart city data governance, urban data remains a challenge for national and city governments due to: insufficient financial resources; a lack of business models for financing and refinancing data collection; limited access to skilled experts; a lack of full compliance with national legislation on data sharing and protection; and data and security risks. Facing these challenges is essential to managing and sharing data sensibly if cities are to boost citizens’ well-being and promote sustainable environments…(More)”

AI Globalism and AI Localism: Governing AI at the Local Level for Global Benefit


Article by Stefaan G. Verhulst: “With the UK Summit in full swing, 2023 will likely be seen as a pivotal year for AI governance, with governments promoting a global governance model: AI Globalism. For it to be relevant, flexible, and effective, any global approach will need to be informed by and complemented with local experimentation and leadership, ensuring local responsiveness: AI Localism.

Even as consumers and businesses extend their use of AI (generative AI in particular), governments are also taking notice. Determined not to be caught on the back foot, as they were with social media, regulators and policymakers around the world are exploring frameworks and institutional structures that could help maximize the benefits while minimizing the potential harms of AI. This week, the UK is hosting a high-profile AI Safety Summit, attended by political and business leaders from around the world, including Kamala Harris and Elon Musk. Similarly, US President Biden recently signed an Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, which he hailed as a “landmark executive order” to ensure “safety, security, trust, openness, and American leadership.”

Amid the various policy and regulatory proposals swirling around, there has been a notable emphasis on what we might call AI globalism. The UK summit has explicitly endorsed a global approach to AI safety, with coordination between the US, EU, and China core to its vision of more responsible and safe AI. This global perspective follows similar recent calls for “an AI equivalent of the IPCC” or the International Atomic Energy Agency (IAEA). Notably, such calls are emerging both from the private sector and from civil society leaders.

In many ways, a global approach makes sense. Like most technology, AI is transnational in scope, and its governance will require cross-jurisdictional coordination and harmonization. At the same time, we believe that AI globalism should be accompanied by a recognition that some of the most innovative AI initiatives are taking place in cities and municipalities and being regulated at those levels too.

We call it AI localism. In what follows, we outline a vision of a more decentralized approach to AI governance, one that would allow cities and local jurisdictions — including states — to develop and iterate governance frameworks tailored to their specific needs and challenges. This decentralized, local approach would need to take place alongside global efforts. The two would not be mutually exclusive but instead necessarily complementary…(More)”.

Urban Development and the State of Open Data


Chapter by Stefaan G. Verhulst and Sampriti Saxena: “Nearly 4.4 billion people, or about 55% of the world’s population, lived in cities in 2018. By 2045, this number is anticipated to grow to 6 billion. Such a level of growth requires innovative and targeted urban solutions. By more effectively leveraging open data, cities can meet the needs of an ever-growing population in an effective and sustainable manner. This paper updates the previous contribution by Jean-Noé Landry, titled “Open Data and Urban Development” in the 2019 edition of The State of Open Data. It also aims to contribute to a further deepening of the Third Wave of Open Data, which highlights the significance of open data at the subnational level as a more direct and immediate response to the on-the-ground needs of citizens. It considers recent developments in how the use of, and approach to, open data has evolved within an urban development context. It seeks to discuss emerging applications of open data in cities, recent developments in open data infrastructure, governance and policies related to open data, and the future outlook of the role of open data in urbanization…(More)”.

Governing Urban Data for the Public Interest


Report by The New Hanse: “…This report represents the culmination of our efforts and offers actionable guidelines for European cities seeking to harness the power of data for the public good.

The key recommendations outlined in the report are:

1. Shift the Paradigm towards Democratic Control of Data: Advocate for a policy that defaults to making urban data accessible, requiring private data holders to share in the public interest.

2. Provide Legal Clarity in a Dynamic Environment: Address legal uncertainties by balancing privacy and confidentiality needs with the public interest in data accessibility, working collaboratively with relevant authorities at national and EU level.

3. Build a Data Commons Repository of Use Cases: Streamline data sharing efforts by establishing a standardised use case repository with common technical frameworks, procedures, and contracts.

4. Set up an Urban Data Intermediary for the Public Interest: Institutionalise data sharing, by building urban data intermediaries to address complexities, following principles of public purpose, transparency, and accountability.

5. Learn from the Hamburg Experiment and Scale It across Europe: Embrace experimentation as a vital step, even if outcomes are uncertain, to adapt processes for future innovations. Experiments at the local level can inform policy and scale nationally and across Europe…(More)”.

Evidence-Based Government Is Alive and Well


Article by Zina Hutton: “A desire to discipline the whimsical rule of despots.” That, according to a 2009 speech by Gary Banks, a former chairman of Australia’s Productivity Commission, is what gave birth to evidence-based policy back in the 14th century. Evidence-based policymaking isn’t a new style of government, but it’s one with well-known roadblocks that elected officials have been working around in order to implement it more widely.

Evidence-based policymaking relies on evidence — facts, data, expert analysis — to shape aspects of long- and short-term policy decisions. It’s not just about collecting data, but also applying it and experts’ analysis to shape future policy. Whether it’s using school enrollment numbers to justify building a new park in a neighborhood or scientists collaborating on analysis of wastewater to try to “catch” illness spread in a community before it becomes unmanageable, evidence-based policy uses facts to help elected and appointed officials decide what funds and other resources to allocate in their communities.

Problems with evidence-based governing have been around for years. They range from a lack of communication between the people designing the policy and its related programs and the people implementing them, to the way local governments struggle to recruit and retain employees. Resource allocation also shapes the decisions some cities make when it comes to seeking out and using data. This can be seen in the way larger cities, with access to proportionately larger budgets, research from state universities within city limits, and a larger workforce, have had more success with evidence-based policymaking.

“The largest cities have more personnel, more expertise, more capacity, whether that’s for collecting administrative data and monitoring it, whether that’s doing open data portals, or dashboards, or whether that’s doing things like policy analysis or program evaluation,” says Karen Mossberger, the Frank and June Sackton Professor in the School of Public Affairs at Arizona State University. “It takes expert personnel, it takes people within government with the skills and the capacity, it takes time.”

Roadblocks aside, state and local governments are finding innovative ways to collaborate with one another on data-focused projects and policy, seeking ways to make up for the problems that impacted early efforts at evidence-based governance. More state and local governments now recruit data experts at every level to collect, analyze and explain the data generated by residents, aided by advances in technology and increased access to researchers…(More)”.

NYC Releases Plan to Embrace AI, and Regulate It


Article by Sarah Holder: “New York City Mayor Eric Adams unveiled a plan for adopting and regulating artificial intelligence on Monday, highlighting the technology’s potential to “improve services and processes across our government” while acknowledging the risks.

The city also announced it is piloting an AI chatbot to answer questions about opening or operating a business through its website MyCity Business.

NYC agencies have reported using more than 30 tools that fit the city’s definition of algorithmic technology, including to match students with public schools, to track foodborne illness outbreaks and to analyze crime patterns. As the technology gets more advanced, and the implications of algorithmic bias, misinformation and privacy concerns become more apparent, the city plans to set policy around new and existing applications…

New York’s strategy, developed by the Office of Technology and Innovation with the input of city agency representatives and outside technology policy experts, doesn’t itself establish any rules and regulations around AI, but lays out a timeline and blueprint for creating them. It emphasizes the need for education and buy-in both from New York constituents and city employees. Within the next year, the city plans to start to hold listening sessions with the public, and brief city agencies on how and why to use AI in their daily operations. The city has also given itself a year to start work on piloting new AI tools, and two to create standards for AI contracts….

Stefaan Verhulst, a research professor at New York University and the co-founder of The GovLab, says that especially during a budget crunch, leaning on AI offers cities opportunities to make evidence-based decisions quickly and with fewer resources. Among the potential use cases he cited are identifying areas most in need of affordable housing, and responding to public health emergencies with data…(More) (Full plan)”.