Citizen social science in practice: the case of the Empty Houses Project


Paper by Alexandra Albert: “The growth of citizen science and participatory science, where non-professional scientists voluntarily participate in scientific activities, raises questions around the ownership and interpretation of data, issues of data quality and reliability, and new kinds of data literacy. Citizen social science (CSS), as an approach that bridges these fields, calls into question the way in which research is undertaken, as well as who can collect data, what data can be collected, and what such data can be used for. This article outlines a case study—the Empty Houses Project—to explore how CSS plays out in practice, and to reflect on the opportunities and challenges it presents. The Empty Houses Project was set up to investigate how citizens could be mobilised to collect data about empty houses in their local area, so as to potentially contribute towards tackling a pressing policy issue. The study shows how the possibilities of CSS exceed the dominant view of it as a new means of creating data repositories. Rather, it considers how the data produced in CSS is an epistemology, and a politics, not just a realist tool for analysis….(More)”.

The Data Shake: Opportunities and Obstacles for Urban Policy Making


Book edited by Grazia Concilio, Paola Pucci, Lieven Raes and Geert Mareels: “This open access book represents one of the key milestones of PoliVisu, an H2020 research and innovation project funded by the European Commission under the call “Policy-development in the age of big data: data-driven policy-making, policy-modelling and policy-implementation”.

It investigates the operational and organizational implications of using the growing amount of available data in policy-making processes, highlighting the experimental dimension of policy making that, thanks to data, can increasingly be exploited to reach more effective and sustainable decisions.

The first section of the book introduces the key questions highlighted by the PoliVisu project, which still represent operational and strategic challenges in the exploitation of data potentials in urban policy making. The second section explores how data and data visualisations can assume different roles in the different stages of a policy cycle and profoundly transform policy making….(More)”.

Measuring Commuting and Economic Activity Inside Cities with Cell Phone Records


Paper by Gabriel Kreindler and Yuhei Miyauchi: “We show how to use commuting flows to infer the spatial distribution of income within a city. A simple workplace choice model predicts a gravity equation for commuting flows whose destination fixed effects correspond to wages. We implement this method with cell phone transaction data from Dhaka and Colombo. Model-predicted income predicts separate income data, at the workplace and residential level, and by skill group. Unlike machine learning approaches, our method does not require training data, yet achieves comparable predictive power. We show that hartals (transportation strikes) in Dhaka reduce commuting more for high model-predicted wage and high-skill commuters….(More)”.
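
To make the gravity-equation step concrete, here is a minimal sketch (not the paper's code) of a Poisson gravity regression with origin and destination fixed effects, where the destination effects stand in for workplace wages; the input file and column names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical input: one row per origin-destination pair, with a commuter
# count and a travel-cost proxy (e.g. travel time inferred from cell phone data).
flows = pd.read_csv("commuting_flows.csv")  # columns: origin, destination, commuters, travel_time

# Poisson (PPML-style) gravity regression: origin and destination fixed effects
# plus a travel-cost term. In the paper's framework, the destination fixed
# effects recover a transform of workplace wages.
gravity = smf.glm(
    "commuters ~ C(origin) + C(destination) + np.log(travel_time)",
    data=flows,
    family=sm.families.Poisson(),
).fit()

# Destination fixed effects as a proxy for the spatial distribution of wages.
dest_effects = {
    name: coef
    for name, coef in gravity.params.items()
    if name.startswith("C(destination)")
}
print(sorted(dest_effects.items(), key=lambda kv: -kv[1])[:10])
```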

Leveraging artificial intelligence to analyze citizens’ opinions on urban green space


Paper by Mohammadhossein Ghahramani, Nadina J. Galle, Fábio Duarte, Carlo Ratti, Francesco Pilla: “Continued population growth and urbanization are shifting research to consider the quality of urban green space (UGS) over the quantity of these parks, woods, and wetlands. The quality of urban green space has hitherto been measured by expert assessments, including in-situ observations, surveys, and remote sensing analyses. Location data platforms, such as TripAdvisor, can provide people’s opinions on many destinations and experiences, including UGS. This paper leverages Artificial Intelligence techniques for opinion mining and text classification using such platforms’ reviews as a novel approach to urban green space quality assessment. Natural Language Processing is used to analyze the contextual information in these reviews, drawing on supervised word-level scores. Such an application can support local authorities and stakeholders in their understanding of, and justification for, future investments in urban green space….(More)”.
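
As a rough illustration of the kind of supervised text-classification pipeline the abstract refers to (a sketch, not the authors' implementation), review text can be vectorized and scored with a simple classifier; the file name, column names, and model choice below are assumptions.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical dataset: one row per review of an urban green space, with the
# review text and a quality label derived from, e.g., its star rating.
reviews = pd.read_csv("ugs_reviews.csv")  # columns: text, quality_label

X_train, X_test, y_train, y_test = train_test_split(
    reviews["text"], reviews["quality_label"], test_size=0.2, random_state=42
)

# TF-IDF features plus logistic regression as a simple supervised text classifier.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=5),
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```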

How Big Data is Transforming the Way We Plan Our Cities


Paper by Rawad Choubassi and Lamia Abdelfattah: “The availability of ubiquitous location-based data in cities has had far-reaching implications for analytical powers in various disciplines. This article focuses on some of the accrued benefits to urban transport planners and the urban planning field at large. It contends that the gains of Big Data and real-time information have not only improved analytical strength, but have also created ripple effects in the systemic approaches of city planning, integrating ex-post studies within the design cycle and redefining the planning process as a microscopic, iterative and self-correcting one. Case studies from the field are used to further highlight these newfound abilities, offered by Big Data, to run fine-grained analyses and propose more customized location-based solutions. A detailed description of the Torrance Living Lab experience maps out some of the potential of using movement data from Big Data sources to design an alternative mobility plan for a low-density urban area. Finally, the paper reflects on Big Data’s limited capacity at present to replace traditional forecast modelling tools, despite demonstrated advantages over traditional methods in gaining insight from past and present travel trends….(More)”.

How One State Managed to Actually Write Rules on Facial Recognition


Kashmir Hill at The New York Times: “Though police have been using facial recognition technology for the last two decades to try to identify unknown people in their investigations, the practice of putting the majority of Americans into a perpetual photo lineup has gotten surprisingly little attention from lawmakers and regulators. Until now.

Lawmakers, civil liberties advocates and police chiefs have debated whether and how to use the technology because of concerns about both privacy and accuracy. But figuring out how to regulate it is tricky. So far, that has meant an all-or-nothing approach. City Councils in Oakland, Portland, San Francisco, Minneapolis and elsewhere have banned police use of the technology, largely because of bias in how it works. Studies in recent years by MIT researchers and the federal government found that many facial recognition algorithms are most accurate for white men, but less so for everyone else.

At the same time, automated facial recognition has become a powerful investigative tool, helping to identify child molesters and, in a recent high-profile example, people who participated in the Jan. 6 riot at the Capitol. Law enforcement officials in Vermont want the state’s ban lifted because there “could be hundreds of kids waiting to be saved.”

That’s why a new law in Massachusetts is so interesting: It’s not all or nothing. The state managed to strike a balance on regulating the technology, allowing law enforcement to harness the benefits of the tool, while building in protections that might prevent the false arrests that have happened before….(More)”.

N.Y.’s Vaccine Websites Weren’t Working. He Built a New One for $50.


Sharon Otterman at New York Times: “Huge Ma, a 31-year-old software engineer for Airbnb, was stunned when he tried to make a coronavirus vaccine appointment for his mother in early January and saw that there were dozens of websites to check, each with its own sign-up protocol. The city and state appointment systems were completely distinct.

“There has to be a better way,” he said he remembered thinking.

So, he developed one. In less than two weeks, he launched TurboVax, a free website that compiles availability from the three main city and state New York vaccine systems and sends the information in real time to Twitter. It cost Mr. Ma less than $50 to build, yet it offers an easier way to spot appointments than the city and state’s official systems do.

“It’s sort of become a challenge to myself, to prove what one person with time and a little motivation can do,” he said last week. “This wasn’t a priority for governments, which was unfortunate. But everyone has a role to play in the pandemic, and I’m just doing the very little that I can to make it a little bit easier.”

Supply shortages and problems with access to vaccination appointments have been some of the barriers to the equitable distribution of the vaccine in New York City and across the United States, officials have acknowledged….(More)”.
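
The article does not publish TurboVax’s code, but the pattern it describes (poll the official appointment feeds, detect newly available slots, and post them to Twitter in real time) can be sketched roughly as below; every endpoint, field name, and credential here is hypothetical, and the Twitter call assumes the tweepy library.

```python
import time

import requests
import tweepy

# Hypothetical list of appointment feeds to poll (the real city/state endpoints differ).
SOURCES = {
    "City site": "https://example.com/city/appointments.json",
    "State site": "https://example.com/state/appointments.json",
}

# Assumes Twitter API credentials for tweepy's v2 client.
client = tweepy.Client(
    consumer_key="...", consumer_secret="...",
    access_token="...", access_token_secret="...",
)

seen = set()

while True:
    for name, url in SOURCES.items():
        try:
            slots = requests.get(url, timeout=10).json()  # hypothetical JSON schema
        except requests.RequestException:
            continue
        for slot in slots.get("appointments", []):
            key = (name, slot.get("site"), slot.get("time"))
            if slot.get("available") and key not in seen:
                seen.add(key)
                client.create_tweet(text=f"{slot['site']}: appointments open ({name})")
    time.sleep(60)  # poll roughly once a minute
```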

Spatial information and the legibility of urban form: Big data in urban morphology


Paper by Geoff Boeing: “Urban planning and morphology have relied on analytical cartography and visual communication tools for centuries to illustrate spatial patterns, conceptualize proposed designs, compare alternatives, and engage the public. Classic urban form visualizations – from Giambattista Nolli’s ichnographic maps of Rome to Allan Jacobs’s figure-ground diagrams of city streets – have compressed physical urban complexity into easily comprehensible information artifacts. Today we can enhance these traditional workflows through the Smart Cities paradigm of understanding cities via user-generated content and harvested data in an information management context. New spatial technology platforms and big data offer new lenses to understand, evaluate, monitor, and manage urban form and evolution. This paper builds on the theoretical framework of visual cultures in urban planning and morphology to introduce and situate computational data science processes for exploring urban fabric patterns and spatial order. It demonstrates these workflows with OSMnx and data from OpenStreetMap, a collaborative spatial information system and mapping platform, to examine street network patterns, orientations, and configurations in different study sites around the world, considering what these reveal about the urban fabric. The age of ubiquitous urban data and computational toolkits opens up a new era of worldwide urban form analysis from integrated quantitative and qualitative perspectives….(More)”.
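
For readers unfamiliar with the toolkit, a minimal OSMnx workflow of the kind the paper demonstrates looks roughly like the sketch below (assuming an OSMnx 1.x-style API; the study site is arbitrary):

```python
import osmnx as ox

# Download a drivable street network for an arbitrary study site from OpenStreetMap.
place = "Piedmont, California, USA"
G = ox.graph_from_place(place, network_type="drive")

# Basic street-network form metrics: node/edge counts, average circuity, etc.
stats = ox.basic_stats(G)
print(stats["n"], stats["m"], stats["circuity_avg"])

# Street orientation: attach compass bearings to edges, then summarize them
# (e.g. to compare how grid-like different cities' street networks are).
G_u = ox.add_edge_bearings(ox.get_undirected(G))
bearings = [data["bearing"] for _, _, data in G_u.edges(data=True)]

# Visualize the network itself, in the figure-ground spirit discussed in the paper.
ox.plot_graph(G_u, node_size=0, edge_linewidth=0.5)
```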

The Rise of Urban Commons


Blogpost by Alessandra Quarta and Antonio Vercellone: “In the last ten years, the concept of the commons has become popular in social studies and political activism, and in some countries domestic lawyers have shared an interest in the notion. Even if a statutory definition of the commons (existing or proposed) is still very rare, lawyers become familiar with the concept through the filter of property law, where it has been quite discredited. In fact, when approaching property law, many students across different legal traditions learn an account of the origins of property rights that revolves around the “tragedy of the commons”, the “parable” made famous by Garrett Hardin in the late nineteen-sixties. According to this widespread narrative, the impossibility of avoiding the over-exploitation of resources managed under an open-access regime makes it necessary to allocate private property rights. In this classic argument, the commons appear in a negative light: they represent the impossibility for a community to manage shared resources without concentrating all decision-making powers in the hands of a single owner or a central government. Moreover, they represent the wasteful inefficiency of the feudal world.

This vision dominated social and economic studies until 1990, when Elinor Ostrom published her famous book Governing the Commons, presenting the results of her research on resources managed by communities in different parts of the world. Ostrom, awarded the Nobel Prize in 2009, demonstrated that the commons are not necessarily a tragedy, nor a place without law. In fact, local communities generally define resilient principles for governing and sharing such resources, so that the tragedy does not occur. Moreover, Ostrom defined a set of principles for assessing whether commons are managed efficiently and can compete with both private and public arrangements for resource management.

Later on, from an institutional perspective, the commons became a tool for contesting mainstream political and economic dogmas, including the supposedly unquestionable efficiency of both the market and private property in allocating resources. The search for new tools for managing resources has been carried out in several experiments, generally at the local and urban level: scholars and practitioners describe these experiences as ‘urban commons’….(More)”.

Nowcasting Gentrification Using Airbnb Data


Paper by Shomik Jain, Davide Proserpio, Giovanni Quattrone, and Daniele Quercia: “There is a rumbling debate over the impact of gentrification: presumed gentrifiers have been the target of protests and attacks in some cities, while they have been welcome as generators of new jobs and taxes in others. Census data fails to measure neighborhood change in real-time since it is usually updated every ten years. This work shows that Airbnb data can be used to quantify and track neighborhood changes. Specifically, we consider both structured data (e.g. number of listings, number of reviews, listing information) and unstructured data (e.g. user-generated reviews processed with natural language processing and machine learning algorithms) for three major cities, New York City (US), Los Angeles (US), and Greater London (UK). We find that Airbnb data (especially its unstructured part) appears to nowcast neighborhood gentrification, measured as changes in housing affordability and demographics. Overall, our results suggest that user-generated data from online platforms can be used to create socioeconomic indices to complement traditional measures that are less granular, not in real-time, and more costly to obtain….(More)”.
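
As a rough sketch of how such an index might be assembled (not the authors' pipeline), structured listing features and a review-sentiment score can be aggregated per neighborhood and used to nowcast a subsequent change in housing affordability; all file, column, and model choices below are assumptions.

```python
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical neighborhood-period panel built from Airbnb data: structured
# features (listing/review counts, mean price) plus a mean review-sentiment
# score from an NLP model, alongside a later change in housing affordability.
panel = pd.read_csv("airbnb_neighborhood_panel.csv")

features = ["n_listings", "n_reviews", "mean_price", "mean_review_sentiment"]
X = panel[features]
y = panel["affordability_change_next_period"]

# Simple regularized regression as a nowcasting baseline.
model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```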