Spain is trialling city monitoring using sound


Springwise: “There’s more traffic on today’s city streets than there ever has been, and managing it all can prove to be a headache for local authorities and transport bodies. In the past, we’ve seen the City of Calgary in Canada detect drivers’ Bluetooth signals to develop a map of traffic congestion. Now the EAR-IT project in Santander, Spain, is using acoustic sensors to measure the sounds of city streets and determine real-time activity on the ground.
Launched as part of the autonomous community’s SmartSantander initiative, the experimental scheme placed hundreds of acoustic processing units around the region. These pick up the sounds being made in any given area and, when processed through an audio recognition engine, can provide data about what’s happening on the street. Smaller ‘motes’ were also developed to provide more accurate location information about each sound.
Created by members of Portugal’s UNINOVA institute and IT consultants EGlobalMark, the system was able to use city noises to detect things such as traffic congestion, parking availability and the location of emergency vehicles based on their sirens. It could then automatically trigger smart signs to display up-to-date information, for example.
The team particularly focused on a junction near the city hospital that’s a hotspot for motor accidents. Rather than force ambulance drivers to risk passing through a red light and into lateral traffic, the sensors were able to detect when and where an emergency vehicle was coming through and automatically change the lights in their favor.
The system could also be used to pick up ‘sonic events’ such as gunshots or explosions and detect their location. The researchers have also trialled an indoor version that can sense if an elderly resident has fallen over, or turn lights off when the room becomes silent.”
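The siren-detection step described above can be sketched crudely: a siren concentrates its energy in a fairly narrow frequency band, so a detector can compare energy in that band against total energy. A minimal Python illustration of the idea (the band limits, threshold, and actuation hook are assumptions for illustration, not EAR-IT's actual pipeline, which would use a trained audio recognition engine):

```python
import numpy as np

def band_energy_ratio(samples, rate, lo=500.0, hi=1500.0):
    """Fraction of spectral energy inside a frequency band (assumed siren range)."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    band = spectrum[(freqs >= lo) & (freqs <= hi)].sum()
    total = spectrum.sum()
    return band / total if total > 0 else 0.0

def detect_siren(samples, rate, threshold=0.6):
    """Flag a window as siren-like if most energy sits in the siren band."""
    return band_energy_ratio(samples, rate) >= threshold

def preempt_lights(samples, rate):
    """Hypothetical actuation hook: a real deployment would signal the
    traffic-light controller; here we just return the decision."""
    return "green-for-ambulance" if detect_siren(samples, rate) else "normal"
```

A pure tone inside the band trips the detector, while broadband street noise (whose energy is spread across the whole spectrum) does not; production systems would additionally track the characteristic siren frequency sweep over time.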

Seattle Launches Sweeping, Ethics-Based Privacy Overhaul


for the Privacy Advisor: “The City of Seattle this week launched a citywide privacy initiative aimed at providing greater transparency into the city’s data collection and use practices.
To that end, the city has convened a group of stakeholders, the Privacy Advisory Committee, comprising various government departments, to look at the ways the city is using data collected from practices as common as utility bill payments and renewing pet licenses or during the administration of emergency services like police and fire. By this summer, the committee will deliver the City Council suggested principles and a “privacy statement” to provide direction on privacy practices citywide.
In addition, the city has partnered with the University of Washington, where Jan Whittington, assistant professor of urban design and planning and associate director at the Center for Information Assurance and Cybersecurity, has been given a $50,000 grant to look at open data, privacy and digital equity and how municipal data collection could harm consumers.
Responsible for all things privacy in this progressive city is Michael Mattmiller, who was hired to the position of chief technology officer (CTO) for the City of Seattle in June. Before his current gig, he worked as a senior strategist in enterprise cloud privacy for Microsoft. He said it’s an exciting time to be at the helm of the office because there’s momentum, there’s talent and there’s intention.
“We’re at this really interesting time where we have a City Council that strongly cares about privacy … We have a new police chief who wants to be very good on privacy … We also have a mayor who is focused on the city being an innovative leader in the way we interact with the public,” he said.
In fact, some City Council members have taken it upon themselves to meet with various groups and coalitions. “We have a really good, solid environment we think we can leverage to do something meaningful,” Mattmiller said….
Armbruster said the end goal is to create policies that will hold weight over time.
“I think when looking at privacy principles, from an ethical foundation, the idea is to create something that will last while technology dances around us,” she said, adding the principles should answer the question, “What do we stand for as a city and how do we want to move forward? So any technology that falls into our laps, we can evaluate and tailor or perhaps take a pass on as it falls under our ethical framework.”
The bottom line, Mattmiller said, is making a decision that says something about Seattle and where it stands.
“How do we craft a privacy policy that establishes who we want to be as a city and how we want to operate?” Mattmiller asked.”

Urban Observatory Is Snapping 9,000 Images A Day Of New York City


FastCo-Exist: “Astronomers have long built observatories to capture the night sky and beyond. Now researchers at NYU are borrowing astronomy’s methods and turning their cameras towards Manhattan’s famous skyline.
NYU’s Center for Urban Science and Progress has been running what’s likely the world’s first “urban observatory” of its kind for about a year. From atop a tall building in downtown Brooklyn (NYU won’t say its address, due to security concerns), two cameras—one regular and one that captures infrared wavelengths—take panoramic images of lower and midtown Manhattan. One photo is snapped every 10 seconds. That’s 8,640 images a day, or more than 3 million since the project began (or about 50 terabytes of data).
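The article's figures check out with back-of-the-envelope arithmetic (the headline's 9,000 rounds the daily count up; the per-image size is an implied average, assuming the ~50 TB covers roughly the first year):

```python
SECONDS_PER_DAY = 24 * 60 * 60
INTERVAL_S = 10                                 # one photo every 10 seconds

images_per_day = SECONDS_PER_DAY // INTERVAL_S  # 8,640
images_per_year = images_per_day * 365          # 3,153,600: "more than 3 million"

# Implied average size if ~50 TB accumulated over about a year: ~16 MB
# per panorama (an inference, not a figure stated in the article).
bytes_per_image = 50e12 / images_per_year
```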

“The real power of the urban observatory is that you have this synoptic imaging. By synoptic imaging, I mean these large swaths of the city,” says the project’s chief scientist Gregory Dobler, a former astrophysicist at Harvard University and the University of California, Santa Barbara, who now heads the 15-person observatory team at NYU.
Dobler’s team is collaborating with New York City officials on the project, which is now expanding to set up stations that study other parts of Manhattan and Brooklyn. Its major goal is to discover information about the urban landscape that can’t be seen at other scales. Such data could lead to applications like tracking which buildings are leaking energy (with the infrared camera), or measuring occupancy patterns of buildings at night, or perhaps detecting releases of toxic chemicals in an emergency.
The video above is an example. The top panel cycles through a one-minute slice of observatory images. The bottom panel is an analysis of the same images in which everything that remains static in each image is removed, such as buildings, trees, and roads. What’s left is an imprint of everything in flux within the scene—the clouds, the cars on the FDR Drive, the boat moving down the East River, and, importantly, a plume of smoke that puffs out of a building.
“Periodically, a building will burp,” says Dobler. “It’s hard to see the puffs of smoke . . . but we can isolate that plume and essentially identify it.” (As Dobler has done by highlighting it in red in the top panel).
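The static-removal step described above is classic background subtraction: estimate the unchanging scene, for instance as a per-pixel temporal median over a stack of frames, then subtract it so only the flux remains. A minimal numpy sketch of that idea, not the observatory team's actual pipeline:

```python
import numpy as np

def remove_static(frames):
    """Subtract the per-pixel temporal median (the static scene: buildings,
    trees, roads) from each frame, leaving moving content such as clouds,
    traffic, and smoke plumes."""
    stack = np.stack(frames).astype(float)
    background = np.median(stack, axis=0)   # static-scene estimate
    return np.abs(stack - background)       # residual: everything in flux
```

A transient bright patch in one frame (a "burp" of smoke, say) survives the subtraction, while pixels that never change come out as zero.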
To the natural privacy concerns about this kind of program, Dobler emphasizes that the pictures are only from an 8-megapixel camera (the same found in the iPhone 6) and aren’t clear enough to see inside a window or make out individuals. As a further privacy safeguard, the images are analyzed to look only at “aggregate” measures—such as the patterns of nighttime energy usage—rather than specific buildings. “We’re not really interested in looking at a given building, and saying, hey, these guys are particular offenders,” he says. (He also says the team is not looking at uses for the data in security applications.) However, Dobler was not able to answer a question as to whether the project’s partners at city agencies are able to access data analysis for individual buildings….”

Governing the Smart, Connected City


Blog by Susan Crawford at HBR: “As politics at the federal level becomes increasingly corrosive and polarized, with trust in Congress and the President at historic lows, Americans still celebrate their cities. And cities are where the action is when it comes to using technology to thicken the mesh of civic goods — more and more cities are using data to animate and inform interactions between government and citizens to improve wellbeing.
Every day, I learn about some new civic improvement that will become possible when we can assume the presence of ubiquitous, cheap, and unlimited data connectivity in cities. Some of these are made possible by the proliferation of smartphones; others rely on the increasing number of internet-connected sensors embedded in the built environment. In both cases, the constant is data. (My new book, The Responsive City, written with co-author Stephen Goldsmith, tells stories from Chicago, Boston, New York City and elsewhere about recent developments along these lines.)
For example, with open fiber networks in place, sending video messages will become as accessible and routine as sending email is now. Take a look at rhinobird.tv, a free, lightweight, open-source video service that works in browsers (no special download needed) and allows anyone to create a hashtag-driven “channel” for particular events and places. A debate or protest could be viewed from a thousand perspectives. Elected officials and public employees could easily hold streaming, virtual town hall meetings.
Given all that video and all those livestreams, we’ll need curation and aggregation to make sense of the flow. That’s why visualization norms, still in their infancy, will become a greater part of literacy. When the Internet Archive attempted late last year to “map” 400,000 hours of television news against worldwide locations, it came up with pulsing blobs of attention. Although visionary Kevin Kelly has been talking about data visualization as a new form of literacy for years, city governments still struggle with presenting complex and changing information in standard, easy-to-consume ways.
Plenar.io is one attempt to resolve this. It’s a platform developed by former Chicago Chief Data Officer Brett Goldstein that allows public datasets to be combined and mapped with easy-to-see relationships among weather and crime, for example, on a single city block. (A sample question anyone can ask of Plenar.io: “Tell me the story of 700 Howard Street in San Francisco.”) Right now, Plenar.io’s visual norm is a map, but it’s easy to imagine other forms of presentation that could become standard. All the city has to do is open up its widely varying datasets…”
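The kind of join Plenar.io performs, aligning independent public datasets on a shared place-and-time key so relationships become visible, can be illustrated with pandas. The column names and rows below are invented for illustration and are not Plenar.io's schema:

```python
import pandas as pd

# Tiny stand-ins for per-block, per-day public datasets (hypothetical data).
weather = pd.DataFrame({
    "date": ["2014-11-01", "2014-11-02"],
    "block": ["700 Howard St", "700 Howard St"],
    "temp_f": [58, 61],
})
crime = pd.DataFrame({
    "date": ["2014-11-01", "2014-11-02"],
    "block": ["700 Howard St", "700 Howard St"],
    "incidents": [2, 0],
})

# Inner join on the shared keys: one combined row per block-day,
# carrying columns from both sources.
combined = weather.merge(crime, on=["date", "block"])
```

Once datasets share the same geolocated, time-stamped key, "tell me the story of this address" becomes a filter over the combined table.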

City slicker


The Economist on how “Data are slowly changing the way cities operate…WAITING for a bus on a drizzly winter morning is miserable. But for London commuters Citymapper, an app, makes it a little more bearable. Users enter their destination into a search box and a range of different ways to get there pop up, along with real-time information about when a bus will arrive or when the next Tube will depart. The app is an example of how data are changing the way people view and use cities. Local governments are gradually starting to catch up.
Nearly all big British cities have started to open up access to their data. On October 23rd the second version of the London Datastore, a huge trove of information on everything from crime statistics to delays on the Tube, was launched. In April Leeds City council opened an online “Data Mill” which contains raw data on such things as footfall in the city centre, the number of allotment sites or visits to libraries. Manchester also releases chunks of data on how the city region operates.
Mostly these websites act as tools for developers and academics to play around with. Since the first Datastore was launched in 2010, around 200 apps, such as Citymapper, have sprung up. Other initiatives have followed. “Whereabouts”, which also launched on October 23rd, is an interactive map by the Future Cities Catapult, a non-profit group, and the Greater London Authority (GLA). It uses 235 data sets, some 150 of them from the Datastore, from the age and occupation of London residents to the number of pubs or types of restaurants in an area. In doing so it suggests a different picture of London neighbourhoods based on eight different categories (see map, and its website: whereaboutslondon.org)….”
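Reducing 235 datasets to eight neighbourhood categories, as Whereabouts does, is typically a clustering problem: represent each area as a feature vector (age profile, pub counts, and so on) and group similar vectors. The article does not say which method the Future Cities Catapult used; below is a tiny k-means sketch of the general technique:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: partition feature vectors into k categories by
    alternating nearest-centre assignment and centre recomputation."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each area to its nearest category centre.
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = np.argmin(dists, axis=1)
        # Move each centre to the mean of its assigned areas.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels
```

With k=8 and one row per London area, the resulting labels would play the role of the map's eight categories; real projects normally standardize features and use a library implementation rather than this sketch.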

Ten Leaders In the Civic Space


List Developed by SeeClickFix:

1. Granicus

Granicus is the leading provider of government webcasting and public meeting software, maintaining the world’s largest network of legislative content…

2. Socrata

Socrata is a cloud software company that aims to democratize access to government data through their open data and open performance platform….

3. CityWorks

Cityworks is the leading provider of GIS-centric asset management solutions, performing cost-effective inspection, monitoring, and condition assessment.


4. NeighborWorks

NeighborWorks is a community development hub that supports more than 240 U.S. development organizations through grants and technical assistance.

5. OpenGov Hub

The OpenGov Hub seeks to bring together existing small and medium-sized organizations working on the broader open government agenda. …

6. Blexting

Blexting is a mobile app that lets individuals photographically survey properties and update condition information for posting and sharing. …

7. Code For America

Code for America aims to forge connections between the public and private sector by organizing a network of people to build technology that makes government services better….

8. NationBuilder

NationBuilder is a cost-effective, accessible software platform that helps communities organize and helps people build relationships.

9. Emerging Local Government Leaders

ELGL is a group of innovative local government leaders who are hungry to make an impact. …


10. ArchiveSocial

ArchiveSocial is a social media archiving solution that automates record keeping from social media networks like Facebook and Twitter….

Creating community data tools for local impact: Piton Foundation


Amy Gahran at Knight Digital Media Center: “Many local funders focus on providing grants and doing fundraising to support local organizations and projects. However, some local funders also directly operate their own programs to serve the local community.
This November in the Denver metro area, the Piton Foundation will launch a major revamp of their Community Facts online database of neighborhood-level community statistics. This resource is designed to help Denver-area nonprofits, researchers, community organizers and others better understand and serve communities in need.
Although Piton serves Denver-area communities, it’s not a community foundation in the traditional sense. Piton is a private, operating foundation established in the 1970s with a mission to improve the lives of Colorado’s low-income children and their families. Recently, Piton became part of Gary Community Investments (GCI) — an organization that invests directly in for-profit and philanthropic organizations, projects and programs to benefit Colorado’s low-income children and their families. Originally Piton focused on serving only the city and county of Denver, but since Piton joined GCI, they’ve expanded their focus to serving communities across the entire seven-county Denver metro area.
Back in 1991, Piton took the then-unusual step of forming its own Data Initiative. This early “open government” effort gained access to local datasets, cleaned them up, and made this data available to organizations serving low-income Denver communities. By opening and democratizing local data, and putting it in the context of local neighborhoods, Piton empowered local organizations to do more for local families in need. Such data can help organizations better target programs or fundraising efforts, or better understand local needs and trends in ways that enhance the services they offer.
In 2011, Piton’s Data Initiative expanded, adding projects like the Colorado Data Engine — an open source online data repository of neighborhood-scale public data in a standardized, geolocated format. The development of this data engine was supported by funding from the Knight Foundation and the Denver Foundation….”

ShareHub: at the Heart of Seoul's Sharing Movement


Cat Johnson at Shareable: “In 2012, Seoul publicly announced its commitment to becoming a sharing city. It has since emerged as a leader of the global sharing movement and serves as a model for cities around the world. Supported by the municipal government and embedded in numerous parts of everyday life in Seoul, the Sharing City project has proven to be an inspiration to city leaders, entrepreneurs, and sharing enthusiasts around the world.
At the heart of Sharing City, Seoul is ShareHub, an online platform that connects users with sharing services, educates and informs the public about sharing initiatives, and serves as the online hub for the Sharing City, Seoul project. Now a year and a half into its existence, ShareHub, which is powered by Creative Commons Korea (CC Korea), has served 1.4 million visitors since launching, hosts more than 350 articles about sharing, and has played a key role in promoting sharing policies and projects. Shareable connected with Nanshil Kwon, manager of ShareHub, to find out more about the project, its role in promoting sharing culture, and the future of the sharing movement in Seoul….”

VoteATX


PressRelease: “Local volunteers have released a free application that helps Austin area residents find the best place to vote. The application, Vote ATX, is available at http://voteatx.us
Travis County voters have many options for voting. The Vote ATX application tries to answer the simple question, “Where is the best place I can go vote right now?” The application is location and calendar aware, and helps identify available voting places – even mobile voting locations that move during the day.
The City of Austin has incorporated the Vote ATX technology to power the voting place finder on its election page at http://www.austintexas.gov/vote
The Vote ATX application was developed by volunteers at Open Austin, and is provided as a free public service. …Open Austin is a citizen volunteer group that promotes open government, open data, and civic application development in Austin, Texas. Open Austin was formed in 2009 by citizens interested in the City of Austin web strategy. Open Austin is non-partisan and non-endorsing. It has conducted voter outreach campaigns in every City of Austin municipal election since 2011. Open Austin is on the web at www.open-austin.org”
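A "location and calendar aware" finder of the kind the press release describes reduces to two steps: filter sites open at the current time, then pick the nearest one. A minimal sketch of that logic (the place names, hours, and coordinates below are invented; VoteATX's real data and rules, such as mobile locations that move during the day, are more involved):

```python
from datetime import time
from math import hypot

# Hypothetical polling places: (name, lat, lon, opens, closes).
PLACES = [
    ("City Hall", 30.2653, -97.7471, time(7, 0), time(19, 0)),
    ("Library",   30.2950, -97.7336, time(10, 0), time(17, 0)),
]

def best_place(lat, lon, now):
    """Nearest place open at `now`; flat-earth distance is adequate
    at city scale. Returns None when nothing is open."""
    open_now = [p for p in PLACES if p[3] <= now <= p[4]]
    if not open_now:
        return None
    return min(open_now, key=lambda p: hypot(p[1] - lat, p[2] - lon))[0]
```

For example, at 8 a.m. only the first site is open, so it wins regardless of distance; at noon the choice falls to whichever open site is closest.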

Mapping the Age of Every Building in Manhattan


Kriston Capps at CityLab: “The Harlem Renaissance was the epicenter of new movements in dance, poetry, painting, and literature, and its impact still registers in all those art forms. If you want to trace the Harlem Renaissance, though, best look to Harlem itself.
Many if not most of the buildings in Harlem today rose between 1900 and 1940—and a new mapping tool called Urban Layers reveals exactly where and when. Harlem boasts very few of the oldest buildings in Manhattan today, but it does represent the island’s densest concentration of buildings constructed during the Great Migration.
Thanks to Morphocode’s Urban Layers, it’s possible to locate nearly every 19th-century building still standing in Manhattan today. That’s just one of the things that you can isolate with the map, which combines two New York City building datasets (PLUTO and Building Footprints) and Mapbox GL JS vector technology to generate an interactive architectural history.
So, looking specifically at Harlem again (with some of the Upper West Side thrown in for good measure), it’s easy to see that very few of the buildings that went up between 1765 and 1860 still stand today….”
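The core operation behind a map like Urban Layers, isolating the buildings whose construction year falls in a chosen range, is straightforward once each footprint carries a year-built attribute. A toy sketch (the records and field names below are invented for illustration; PLUTO's actual attribute names may differ):

```python
# Hypothetical stand-ins for PLUTO-style lot records.
buildings = [
    {"bbl": "1000010001", "year_built": 1846},
    {"bbl": "1000010002", "year_built": 1901},
    {"bbl": "1000010003", "year_built": 1928},
]

def built_between(records, start, end):
    """Keep only buildings whose construction year falls in [start, end]."""
    return [b for b in records if start <= b["year_built"] <= end]

# e.g. the surviving pre-Civil War stock:
survivors = built_between(buildings, 1765, 1860)
```

Rendering each filtered footprint as a map layer, one layer per era, gives the "urban layers" view the article describes.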