Website by Sean Hardesty Lewis: “Every ten years, New York City conducts a massive, manual census of its street trees. Thousands of volunteers walk every block with clipboards, counting and identifying every oak and maple across the five boroughs.
They do it because the digital map does not know the trees exist.
To Google or Apple, the city is a grid of addresses and listings. The rest of the world gets flattened. Not because it is invisible, but because it was never entered into a database. The map can tell you where a pharmacy is. It cannot tell you where the fire escapes are. Where the murals are. Where the awnings begin. Where the street trees actually cast shade. Where the scaffolding still hangs.
That is not a New York problem. It is a mapping problem.
We processed hundreds of thousands of Manhattan street view images with a vision language model (VLM). Instead of asking the model for coordinates, we simply asked it to describe what it saw…(More)”.
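The description-first prompting the excerpt describes could be sketched as follows. This is a minimal, hypothetical illustration, not the authors' pipeline: the prompt wording, model name, and chat-style request shape are all assumptions for the sketch, and the function only builds the request payload rather than calling any API.

```python
# Hypothetical sketch of "describe, don't locate" VLM prompting.
# Model name, prompt wording, and request schema are illustrative
# assumptions, not the actual pipeline from the excerpt.

DESCRIBE_PROMPT = (
    "Describe everything visible in this street-level photo: "
    "trees, murals, awnings, scaffolding, fire escapes, storefronts. "
    "Do not guess coordinates or addresses."
)

def build_vlm_request(image_url: str, model: str = "example-vlm") -> dict:
    """Build a chat-style request asking the model to describe the scene,
    rather than asking it for coordinates."""
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": DESCRIBE_PROMPT},
                    {"type": "image_url",
                     "image_url": {"url": image_url}},
                ],
            }
        ],
    }

# One request per street view frame; at scale this would be batched
# over hundreds of thousands of images.
req = build_vlm_request("https://example.com/streetview/frame.jpg")
```

The point of the design is in the prompt: by never asking for locations, the model is free to surface features (shade, scaffolding, murals) that no address-based database records, and geolocation can be handled separately from the imagery metadata.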