How to stop our cities from being turned into AI jungles


Stefaan G. Verhulst at The Conversation: “As artificial intelligence grows more ubiquitous, its potential and the challenges it presents are coming increasingly into focus. How we balance the risks and opportunities is shaping up as one of the defining questions of our era. In much the same way that cities have emerged as hubs of innovation in culture, politics, and commerce, so they are defining the frontiers of AI governance.

Some examples of how cities have been taking the lead include the Cities Coalition for Digital Rights, the Montreal Declaration for Responsible AI, and the Open Dialogue on AI Ethics. Others can be found in San Francisco’s ban of facial-recognition technology, and New York City’s push for regulating the sale of automated hiring systems and creation of an algorithms management and policy officer. Urban institutes, universities and other educational centres have also been forging ahead with a range of AI ethics initiatives.

These efforts point to an emerging paradigm that has been referred to as AI Localism. It is part of a larger phenomenon often called New Localism, which involves cities taking the lead in regulation and policymaking to develop context-specific approaches to a variety of problems and challenges. We have also seen an increased uptake of city-centric approaches within international law frameworks.

Below are ten principles to help systematise our approach to AI Localism. Considered together, they add up to an incipient framework for implementing and assessing initiatives around the world:…(More)”.

E-Government Survey 2022


UN Report: “The United Nations E-Government Survey 2022 is the 12th edition of the United Nations’ assessment of the digital government landscape across all 193 Member States. The E-Government Survey is informed by over two decades of longitudinal research, with a ranking of countries based on the United Nations E-Government Development Index (EGDI), a combination of primary data (collected and owned by the United Nations Department of Economic and Social Affairs) and secondary data from other UN agencies.

This edition of the Survey includes data analysis in global and regional contexts, a study of local e-government development based on the United Nations Local Online Service Index (LOSI), consideration of inclusion in the hybrid digital society, and a concluding chapter that outlines the trends and developments related to the future of digital government. As with all editions, it features extensive annexes on its data, methodology and related pilot study initiatives…(More)”.
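The EGDI mentioned above is, in essence, a simple composite measure: the arithmetic mean of three normalized sub-indices, the Online Service Index (OSI), the Telecommunication Infrastructure Index (TII), and the Human Capital Index (HCI). A minimal sketch of that calculation (illustrative only, with made-up sub-index values, not official UN code or data):

```python
def egdi(osi: float, tii: float, hci: float) -> float:
    """E-Government Development Index: the simple average of the three
    normalized (0-1) sub-indices used by the UN E-Government Survey."""
    for name, value in (("OSI", osi), ("TII", tii), ("HCI", hci)):
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"{name} must be normalized to [0, 1], got {value}")
    return (osi + tii + hci) / 3

# Hypothetical sub-index values for one country:
print(round(egdi(0.85, 0.70, 0.91), 4))
```

Because each sub-index is first normalized to the same 0–1 scale, a country weak in infrastructure (TII) can still rank highly on the strength of its online services and human capital.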

How Game Design Principles Can Enhance Democracy


Essay by Adrian Hon: “Gamification — the use of ideas from game design for purposes beyond entertainment — is everywhere. It’s in our smartwatches, cajoling us to walk an extra thousand steps for a digital trophy. It’s in our classrooms, where teachers use apps to reward and punish children with points. And it’s in our jobs, turning the work of Uber drivers and call center staff into quests and missions, where success comes with an achievement and a $50 bonus, and failure — well, you can imagine.

Many choose to gamify parts of their lives to make them a little more fun, like learning a new language with Duolingo or going for a run with my own Zombies, Run! app. But the gamification we’re most likely to encounter in our lives is something we have no control over — in our increasingly surveilled and gamified workplaces, for instance, or through the creeping advance of manipulative gamification in financial, insurance, travel and health services.

In my new book, “You’ve Been Played,” I argue that governments must regulate gamification so that it respects workers’ privacy and dignity. Regulators must also ensure that gamified finance apps and video games don’t manipulate users into losing more money than they can afford. Crucially, I believe any gamification intended for schools and colleges must be researched and debated openly before deployment.

But I also believe gamification can strengthen democracies, by designing democratic participation to be accessible and to build consensus. The same game design ideas that have made video games the 21st century’s dominant form of entertainment — adaptive difficulty, responsive interfaces, progress indicators and multiplayer systems that encourage co-operative behaviour — can be harnessed in the service of democracies and civil society…

Fully participating in democracy today — not just voting, but getting involved in local planning and budgeting processes, or building and sharing knowledge — involves navigating increasingly complex systems that desperately need to be made more welcoming and accessible. So while the idea of gamifying democracy may seem to trivialize the deep problems we face today or be another instance of techno-solutionism, that’s not my intention. It’s a recognition that we already live in a digital democracy — one where deliberation takes place on social media that’s gamified to reward and promote the hottest takes and most divisive comments by means of upvotes and karma points; where people learn about the world through the warped lens of conspiracy theories that resemble alternate reality games; and where collective action is enabled and amplified by popularity contests on crowdfunding websites and Reddit.

…(More)”.

Working with AI: Real Stories of Human-Machine Collaboration


Book by Thomas H. Davenport and Steven M. Miller: “This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—“smart”—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine-learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each one, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short “insight” chapters draw out common themes and consider the implications of human collaboration with smart systems…(More)”.

The Data Liberation Project 


About: “The Data Liberation Project is a new initiative I’m launching today to identify, obtain, reformat, clean, document, publish, and disseminate government datasets of public interest. Vast troves of government data are inaccessible to the people and communities who need them most. The Process:

  • Identify: Through its own research, as well as through consultations with journalists, community groups, government-data experts, and others, the Data Liberation Project aims to identify a large number of datasets worth pursuing.
  • Obtain: The Data Liberation Project plans to use a wide range of methods to obtain the datasets, including via Freedom of Information Act requests, intervening in lawsuits, web-scraping, and advanced document parsing. To improve public knowledge about government data systems, the Data Liberation Project also files FOIA requests for essential metadata, such as database schemas, record layouts, data dictionaries, user guides, and glossaries.
  • Reformat: Many datasets are delivered to journalists and the public in difficult-to-use formats. Some may follow arcane conventions or require proprietary software to access, for instance. The Data Liberation Project will convert these datasets into open formats, and restructure them so that they can be more easily examined.
  • Clean: The Data Liberation Project will not alter the raw records it receives. But when the messiness of datasets inhibits their usefulness, the project will create secondary, “clean” versions of datasets that fix these problems.
  • Document: Datasets are meaningless without context, and practically useless without documentation. The Data Liberation Project will gather official documentation for each dataset into a central location. It will also fill observed gaps in the documentation through its own research, interviews, and analysis.
  • Disseminate: The Data Liberation Project will not expect reporters and other members of the public simply to stumble upon these datasets. Instead, it will reach out to the newsrooms and communities that stand to benefit most from the data. The project will host hands-on workshops, webinars, and other events to help others to understand and use the data.”…(More)”
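The “Reformat” step above often means converting records from arcane layouts into open formats like CSV. As a rough sketch of what that can look like (the field names and column widths here are invented for illustration; real layouts come from the record layouts and data dictionaries the project requests via FOIA), here is a conversion from a fixed-width file to CSV:

```python
import csv
import io

# Hypothetical record layout: (field name, column width).
# A real layout would be taken from the agency's own data dictionary.
LAYOUT = [("agency_id", 4), ("year", 4), ("amount", 8)]

def parse_fixed_width(line: str) -> dict:
    """Slice one fixed-width record into named fields per LAYOUT."""
    record, pos = {}, 0
    for name, width in LAYOUT:
        record[name] = line[pos:pos + width].strip()
        pos += width
    return record

def to_csv(raw_lines: list[str]) -> str:
    """Convert fixed-width records into an open CSV with a header row,
    so the data can be examined in any spreadsheet or analysis tool."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=[name for name, _ in LAYOUT])
    writer.writeheader()
    for line in raw_lines:
        writer.writerow(parse_fixed_width(line))
    return out.getvalue()

print(to_csv(["00012021  150.00", "00022022   75.50"]))
```

Note that this matches the project’s “Clean” principle as well: the raw file is left untouched, and the reformatted CSV is produced as a secondary artifact.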

OECD Guidelines for Citizen Participation Processes


OECD: “The OECD Guidelines for Citizen Participation Processes are intended for any public official or public institution interested in carrying out a citizen participation process. The guidelines describe ten steps for designing, planning, implementing and evaluating a citizen participation process, and discuss eight different methods for involving citizens: information and data, open meetings, public consultations, open innovation, citizen science, civic monitoring, participatory budgeting and representative deliberative processes. The guidelines are illustrated with examples as well as practical guidance built on evidence gathered by the OECD. Finally, nine guiding principles are presented to help ensure the quality of these processes…(More)”.

Applications of an Analytic Framework on Using Public Opinion Data for Solving Intelligence Problems


Report by the National Academies of Sciences, Engineering, and Medicine: “Measuring and analyzing public opinion comes with tremendous challenges, as evidenced by recent struggles to predict election outcomes and to anticipate mass mobilizations. The National Academies of Sciences, Engineering, and Medicine publication Measurement and Analysis of Public Opinion: An Analytic Framework presents in-depth information from experts on how to collect and glean insights from public opinion data, particularly in conditions where contextual issues call for applying caveats to those data. The Analytic Framework is designed specifically to help intelligence community analysts apply insights from the social and behavioral sciences on state-of-the-art approaches to analyze public attitudes in non-Western populations. Sponsored by the intelligence community, the National Academies’ Board on Behavioral, Cognitive, and Sensory Sciences hosted a 2-day hybrid workshop on March 8–9, 2022, to present the Analytic Framework and to demonstrate its application across a series of hypothetical scenarios that might arise for an intelligence analyst tasked with summarizing public attitudes to inform a policy decision. Workshop participants explored cutting-edge methods for using large-scale data as well as cultural and ethical considerations for the collection and use of public opinion data. This publication summarizes the presentations and discussions of the workshop…(More)”.

Why Funders Should Go Meta


Paper by Stuart Buck & Anna Harvey: “We don’t mean the former Facebook. Rather, philanthropies should prefer to fund meta-issues—i.e., research and evaluation, along with efforts to improve research quality. In many cases, it would be far more impactful than what they are doing now.

This is true at two levels.

First, suppose you want to support a certain cause: economic development in Africa, criminal justice reform in the US, and so on. You could spend millions or even billions on that cause.

But let’s go meta: a force multiplier would be funding high-quality research on what works on those issues. If you invest significantly in social and behavioral science research, you might find innumerable ways to improve on the existing status quo of donations.

Instead of only helping the existing nonprofits who seek to address economic development or criminal justice reform, you’d be helping to figure out what works and what doesn’t. The result could be a much better set of investments for all donors.

Perhaps some of your initial ideas end up not working, when exhaustively researched. At worst, that’s a temporary embarrassment, but it’s actually all for the better—now you and others know to avoid wasting more money on those ideas. Perhaps some of your favored policies are indeed good ideas (e.g., vaccination), but don’t have anywhere near enough take-up by the affected populations. Social and behavioral science research (as in the Social Science Research Council’s Mercury Project) could help find cost-effective ways to solve that problem…(More)”.

Building the analytic capacity to support critical technology strategy


Paper by Erica R.H. Fuchs: “Existing federal agencies relevant to the science and technology enterprise are appropriately focused on their missions, but the U.S. lacks the intellectual foundations, data infrastructure, and analytics to identify opportunities where the value of investment across missions (e.g., national security, economic prosperity, social well-being) is greater than the sum of its parts.

The U.S. government lacks systematic mechanisms to assess the nation’s strengths, weaknesses, and opportunities in technology and to assess the long chain of suppliers involved in producing products critical to national missions.

Two examples where modern data and analytics—leveraging star interdisciplinary talent from across the nation—and a cross-mission approach could transform outcomes include 1) the difficulties the federal government had in facilitating the production and distribution of personal protective equipment in spring 2020, and 2) the lack of clarity about the causes and solutions to the semiconductor shortage. Going forward, the scale-up of electric vehicles promises similar challenges…

A critical technology analytics (CTA) program would identify 1) how emerging technologies and institutional innovations could potentially transform timely situational awareness of U.S. and global technology capabilities, 2) opportunities for innovation to transform U.S. domestic and international challenges, and 3) win-win opportunities across national missions. The program would be strategic and forward-looking, conducting work on a timeline of months and years rather than days and weeks, and would seek to generalize lessons from individual cases to inform the data and analytics capabilities that the government needs to build to support cross-mission critical technology policy…(More)”.

A ‘Feminist’ Server to Help People Own Their Own Data


Article by Padmini Ray Murray: “All of our digital lives reside on servers – mostly in corporate server farms owned by the likes of Google, Amazon, Apple, and Microsoft. These farms contain machines that store massive volumes of data generated by every single user of the internet. These vast infrastructures allow people to store, connect, and exchange information on the internet.

Consequently, there is a massive distance between users and where and how the data is stored, which means that individuals have very little control over how their data is stored and used. However, due to the huge reliance on these massive corporate technologies, individuals are left with very little choice but to accept the terms dictated by these businesses. The conceptual alternative of the feminist server was created by groups of feminist and queer activists who were concerned about how little power they have over owning and managing their data on the internet. The idea of the feminist server was described as a project that is interested in “creating a more autonomous infrastructure to ensure that data, projects and memory of feminist groups are properly accessible, preserved and managed” – a safe digital library to store and manage content generated by feminist groups. This was also a direct challenge to the traditionally male-dominated spaces of computer hardware management, spaces which could be very exclusionary and hostile to women or queer individuals who might be interested in learning how to use these technologies. 

There are two related ways by which a server can be considered as feminist. The first is based on who runs the server, and the second is based on who owns the server. Feminist critics have pointed out how the running of servers is often in the hands of male experts who are not keen to share and explain the knowledge required to maintain a server – a role known as a systems admin or, colloquially, a “sysadmin” person. Thus the concept of feminist servers emerged out of a need to challenge patriarchal dominance in hardware and infrastructure spaces, to create alternatives that were nurturing, anti-capitalist, and worked on the basis of community and solidarity…(More)”.