Can Government Mine Tweets to Assess Public Opinion?


From Government Technology: “What if instead of going to a city meeting, you could go on Twitter, tweet your opinion, and still be heard by those in government? New research suggests this is a possibility.
The Urban Attitudes Lab at Tufts University has conducted research on accessing “big data” on social networking sites for civic purposes, according to Justin Hollander, associate professor in the Department of Urban and Environmental Policy and Planning at Tufts.
About six months ago, Hollander began researching new ways of assessing how people think about the places where they live, work and play. “We’re looking to see how tapping into social media data to understand attitudes and opinions can benefit both urban planning and public policy,” he said.
Harnessing natural comments — there are about one billion tweets per day — could help governments learn what people are saying and feeling, said Hollander. And while formal types of data can be used as proxies for how happy people are, people openly share their sentiments on social networking sites.
Twitter and other social media sites can also provide information in an unobtrusive way. “The idea is that we can capture a potentially more valid and reliable view [of people’s] opinions about the world,” he said. As an inexact science, social science relies on a wide range of data sources to inform research, including surveys, interviews and focus groups; but people respond to being the subject of study, possibly affecting outcomes, Hollander said.
Hollander is also interested in extracting data from social sites because it can be done on a 24/7 basis, which means not having to wait for the government to administer surveys like the Decennial Census. Information from Twitter can also be connected to place; Hollander estimates that about 10 percent of all tweets are geotagged to a location.
In its first study earlier this year, the lab looked at using big data to learn about people’s sentiments and civic interests in New Bedford, Mass., comparing Twitter messages with the city’s published meeting minutes.
To extract tweets over a six-week period from February to April, researchers used the lab’s own software to capture 122,186 tweets geotagged within the city that also contained words pertaining to the New Bedford area. Hollander said anyone can use Twitter’s API to mine data from an area as small as a neighborhood of a couple hundred houses.
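To give a feel for what this kind of geographic filtering involves, here is a minimal sketch in Python. It is not the lab's software; the tweet layout follows Twitter's GeoJSON-style "coordinates" field (longitude, then latitude), and the bounding box and sample tweets are illustrative assumptions.

```python
# Minimal sketch of bounding-box filtering for geotagged tweets.
# Not the lab's software; field names and coordinates are illustrative.

def in_bbox(tweet, west, south, east, north):
    """Return True if the tweet's point geotag falls inside the bounding box."""
    point = (tweet.get("coordinates") or {}).get("coordinates")
    if not point:
        return False  # skip tweets that are not geotagged
    lon, lat = point
    return west <= lon <= east and south <= lat <= north

# Rough, illustrative bounding box around New Bedford, Mass.
NEW_BEDFORD = (-70.99, 41.59, -70.88, 41.70)

tweets = [
    {"text": "Great afternoon at Buttonwood Park",
     "coordinates": {"coordinates": [-70.94, 41.64]}},
    {"text": "Stuck in Boston traffic",
     "coordinates": {"coordinates": [-71.06, 42.36]}},
]
local = [t for t in tweets if in_bbox(t, *NEW_BEDFORD)]
print(len(local))  # 1: only the New Bedford tweet survives the filter
```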
Researchers used IBM’s SPSS Modeler software, alongside custom-designed software for comparison, to apply a sentiment dictionary of nearly 3,000 words, assigning each phrase a sentiment score ranging from -5 for awful feelings to +5 for feelings of elation. Of the Twitter messages, about 7 percent were positive and 5.5 percent negative; in the meeting minutes, 1.7 percent were positive and 0.7 percent negative. In total, about 11,000 messages contained sentiments.
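As a rough illustration of dictionary-based scoring on that -5 to +5 scale, here is a minimal sketch; the tiny lexicon stands in for the nearly 3,000-word dictionary and is purely illustrative.

```python
# Minimal sketch of dictionary-based sentiment scoring on a -5..+5 scale.
# The lexicon below is illustrative; the lab's dictionary had ~3,000 terms.
LEXICON = {"awful": -5, "terrible": -4, "bad": -3, "fine": 1,
           "good": 3, "love": 4, "elated": 5}

def sentiment_score(text):
    """Average the scores of known sentiment words; None if no words match."""
    hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else None

print(sentiment_score("love the new waterfront park"))   # positive (4.0)
print(sentiment_score("traffic on route 18 was awful"))  # negative (-5.0)
print(sentiment_score("city council meets tuesday"))     # None: no sentiment words
```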
The lab also used NVivo qualitative software to analyze 24 key words in a one-year sample of the city’s meeting minutes. By searching for the same words in Twitter posts, the researchers found that “school,” “health,” “safety,” “parks,” “field” and “children” were used frequently across both mediums.
….
Next up for the lab is a new study contrasting Twitter posts from four Massachusetts cities with the recent election results.

The Next Frontier of Engagement: Civic Innovation Labs


Maayan Dembo at Planetizen: “As described by Clayton Christensen, a professor at the Harvard Business School who developed the term “disruptive innovation,” a successful office for social innovation should employ four main tactics to accomplish its mission. First, governments should invest “in innovations that are developed and identified by citizens outside of government who better understand the problems.” Second, the office should support “‘bottom-up’ initiatives, in preference to ‘trickle-down’ philanthropy—because the societal impact of the former is typically greater.” Third, Christensen argues that the office should utilize impact metrics to measure performance and, finally, that it should also invest in social innovation outside of the non-profit sector.
Los Angeles’ most recent citizen-driven social innovation initiative, the Civic Innovation Lab, is an 11-month project aimed at prototyping new solutions for issues within the city of Los Angeles. It is supported by the HubLA, Learn Do Share, the Los Angeles City Tech Bullpen, and Innovate LA, a membership organization within the Los Angeles County Economic Development Corporation. Private and public sector support for such labs, in one of the largest cities in America, is virtually unprecedented, and because this initiative in Los Angeles is a new mechanism explicitly supported by the public sector, it warrants a critical check on its motivations and accomplishments. Depending on its success, the Civic Innovation Lab could serve as a model for other municipalities.
The Los Angeles Civic Innovation Lab operates in three main phases: 1) workshops where citizens learn about the possibilities of Open Data and discuss what deep challenges face Los Angeles (called the “Discover, Define, Design” stage), 2) a call for solutions to solve the design challenges brought to light in the first phase, and 3) a six-month accelerator program to prototype selected solutions. I participated in the most recent Civic Innovation Lab session, a three-day workshop concluding the “Discover, Define, Design” phase….”

Good data make better cities


Stephen Goldsmith and Susan Crawford at the Boston Globe: “…Federal laws prevent sharing of information among state workers helping the same family. In one state’s public health agency, workers fighting obesity cannot receive information from another official inside the same agency assigned to a program aimed at fighting diabetes. In areas where citizens are worried about environmental justice, sensors collecting air quality information are feared — because they could monitor the movements of people. Cameras that might provide a crucial clue to the identity of a terrorist are similarly feared because they might capture images of innocent bystanders.
In order for the public to develop confidence that data tools work for its betterment, not against it, we have work to do. Leaders need to establish policies covering data access, retention, security, and transparency. Forensic capacity — to look back and see who had access to what for what reason — should be a top priority in the development of any data system. So too should clear consequences for data misuse by government employees.
If we get this right, the payoffs for democracy will be enormous. Data can provide powerful insights into the equity of public services and dramatically increase the effectiveness of social programs. Existing 311 digital systems can become platforms for citizen engagement rather than just channels for complaints. Government services can be graded by citizens and improved in response to a continuous loop of interaction. Cities can search through anonymized data in a huge variety of databases for correlations between particular facts and desired outcomes and then apply that knowledge to drive toward results: What can a city do to reduce rates of obesity and asthma? Which bridges are in need of preventative maintenance? And repurposing dollars from ineffective programs and vendors to interventions that work will help cities be safer, cleaner, and more effective.
The digital revolution has finally reached inside the walls of city hall, making this the best time within living memory to be involved in local government. We believe that doing many small things right using data will build trust, making it more likely that citizens will support their city’s need to do big things — including addressing economic dislocation.
Data rules should genuinely protect individuals, not limit our ability to serve them better. When it comes to data, unreasoning fear is our greatest enemy…”

Cities Find Rewards in Cheap Technologies


Nanette Byrnes at MIT Technology Review: “Cities around the globe, whether rich or poor, are in the midst of a technology experiment. Urban planners are pulling data from inexpensive sensors mounted on traffic lights and park benches, and from mobile apps on citizens’ smartphones, to analyze how their cities really operate. They hope the data will reveal how to run their cities better and improve urban life. City leaders and technology experts say that managing the growing challenges of cities well and affordably will be close to impossible without smart technology.
Fifty-four percent of humanity lives in urban centers, and almost all of the world’s projected population growth over the next three decades will take place in cities, including many very poor cities. Because of their density and often strained infrastructure, cities have an outsize impact on the environment, consuming two-thirds of the globe’s energy and contributing 70 percent of its greenhouse-gas emissions. Urban water systems are leaky. Pollution levels are often extreme.
But cities also contribute most of the world’s economic production. Thirty percent of the world’s economy and most of its innovation are concentrated in just 100 cities. Can technology help manage rapid population expansion while also nurturing cities’ all-important role as an economic driver? That’s the big question at the heart of this Business Report.
Selling answers to that question has become a big business. IBM, Cisco, Hitachi, Siemens, and others have taken aim at this market, publicizing successful examples of cities that have used their technology to tackle the challenges of parking, traffic, transportation, weather, energy use, water management, and policing. Cities already spend a billion dollars a year on these systems, and that’s expected to grow to $12 billion a year or more in the next 10 years.
To justify this kind of outlay, urban technologists will have to move past the test projects that dominate discussions today. Instead, they’ll have to solve some of the profound and growing problems of urban living. Cities leaning in that direction are using various technologies to ease parking, measure traffic, and save water (see “Sensing Santander”), reduce rates of violent crime (see “Data-Toting Cops”), and prepare for ever more severe weather patterns.
There are lessons to be learned, too, from cities whose grandiose technological ideas have fallen short, like the eco-city initiative of Tianjin, China (see “China’s Future City”), which has few residents despite great technology and deep government support.
The streets are similarly largely empty in the experimental high-tech cities of Songdo, South Korea; Masdar City, Abu Dhabi; and Paredes, Portugal, which are being designed to have minimal impact on the environment and offer high-tech conveniences such as solar-powered air-conditioning and pneumatic waste disposal systems instead of garbage trucks. Meanwhile, established cities are taking a much more incremental, less ambitious, and perhaps more workable approach, often benefiting from relatively inexpensive and flexible digital technologies….”

The Reliability of Tweets as a Supplementary Method of Seasonal Influenza Surveillance


New paper by Ming-Hsiang Tsou et al. in the Journal of Medical Internet Research: “Existing influenza surveillance in the United States is focused on the collection of data from sentinel physicians and hospitals; however, the compilation and distribution of reports are usually delayed by up to 2 weeks. With the growing popularity of social media, the Internet is a source for syndromic surveillance due to the availability of large amounts of data. In this study, tweets, or posts of 140 characters or fewer, from the website Twitter were collected and analyzed for their potential as surveillance for seasonal influenza.
Objective: There were three aims: (1) to improve the correlation of tweets to sentinel-provided influenza-like illness (ILI) rates by city through filtering and a machine-learning classifier, (2) to observe correlations of tweets with emergency department ILI rates by city, and (3) to explore correlations of tweets with laboratory-confirmed influenza cases in San Diego.
Methods: Tweets containing the keyword “flu” were collected within a 17-mile radius from 11 US cities selected for population and availability of ILI data. At the end of the collection period, 159,802 tweets were used for correlation analyses with sentinel-provided ILI and emergency department ILI rates as reported by the corresponding city or county health department. Two separate methods were used to observe correlations between tweets and ILI rates: filtering the tweets by type (non-retweets, retweets, tweets with a URL, tweets without a URL), and the use of a machine-learning classifier that determined whether a tweet was “valid”, or from a user who was likely ill with the flu.
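The results below are Pearson correlations (r values) between tweet series and ILI rates. As an assumed illustration of that computation, not the paper's code, here is a minimal sketch with made-up weekly figures.

```python
# Minimal sketch of correlating weekly "flu" tweet counts with ILI rates
# using Pearson's r. The weekly figures below are made-up placeholders.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

weekly_tweet_counts = [120, 180, 260, 310, 280, 190]   # filtered "flu" tweets
weekly_ili_rate = [1.1, 1.6, 2.4, 2.9, 2.5, 1.7]       # sentinel ILI rate (%)
print(round(pearson_r(weekly_tweet_counts, weekly_ili_rate), 3))
```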
Results: Correlations varied by city but general trends were observed. Non-retweets and tweets without a URL had higher and more significant (P<.05) correlations than retweets and tweets with a URL. Correlations of tweets to emergency department ILI rates were higher than the correlations observed for sentinel-provided ILI for most of the cities. The machine-learning classifier yielded the highest correlations for many of the cities when using the sentinel-provided or emergency department ILI as well as the number of laboratory-confirmed influenza cases in San Diego. High correlation values (r=.93) with significance at P<.001 were observed for laboratory-confirmed influenza cases for most categories and tweets determined to be valid by the classifier.
Conclusions: Compared with tweet analyses from the previous influenza season, this study demonstrated increased accuracy in using Twitter as a supplementary surveillance tool for influenza: better filtering and classification methods yielded higher correlations for the 2013-2014 influenza season than were found the previous season, and emergency department ILI rates correlated more strongly with tweets than sentinel-provided ILI rates did. Further investigations in the field would require expanding the locations from which tweets are collected, as well as the availability of more ILI data…”

A New Taxonomy of Smart City Projects


New paper by Guido Perboli et al.: “City logistics proposes an integrated vision of freight transportation systems within urban areas, and it aims at optimizing them as a whole in terms of efficiency, security, safety, viability and environmental sustainability. Recently, this perspective has been extended by the Smart City concept to include other aspects of city management: building, energy, environment, government, living, mobility, education, health and so on. To the best of our knowledge, a classification of Smart City projects has not yet been created. This paper introduces such a classification, highlighting success factors and analyzing new trends in Smart Cities.”

Stories of Innovative Democracy at Local Level


Special Issue of Field Actions Science Reports published in partnership with CIVICUS, coordinated by Dorothée Guénéheux, Clara Bosco, Agnès Chamayou and Henri Rouillé d’Orfeuil: “This special issue presents many and varied field actions, such as the promotion of the rights of young people, the resolution of conflicts over agropastoral activities, or participatory decision-making on community budgetary allocations, among many others. It addresses projects developed all over the world, on five continents, covering both the northern and southern hemispheres. The legitimate initial queries and doubts about the publication’s feasibility that assailed those who started it have been swept away by the enthusiasm and the large number of papers that have been sent in….”

 

Spain is trialling city monitoring using sound


Springwise: “There’s more traffic on today’s city streets than ever before, and managing it all can prove to be a headache for local authorities and transport bodies. In the past, we’ve seen the City of Calgary in Canada detect drivers’ Bluetooth signals to develop a map of traffic congestion. Now the EAR-IT project in Santander, Spain, is using acoustic sensors to measure the sounds of city streets and determine real-time activity on the ground.
Launched as part of the autonomous community’s SmartSantander initiative, the experimental scheme placed hundreds of acoustic processing units around the region. These pick up the sounds being made in any given area and, when processed through an audio recognition engine, can provide data about what’s going on on the street. Smaller ‘motes’ were also developed to provide more accurate location information about each sound.
Created by members of Portugal’s UNINOVA institute and IT consultants EGlobalMark, the system was able to use city noises to detect things such as traffic congestion, parking availability and the location of emergency vehicles based on their sirens. It could then automatically trigger smart signs to display up-to-date information, for example.
The team particularly focused on a junction near the city hospital that’s a hotspot for motor accidents. Rather than force ambulance drivers to risk passing through a red light and into lateral traffic, the sensors were able to detect when and where an emergency vehicle was coming through and automatically change the lights in their favor.
The system could also be used to pick up ‘sonic events’ such as gunshots or explosions and detect their location. The researchers have also trialled an indoor version that can sense if an elderly resident has fallen over, or turn the lights off when a room becomes silent.”
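As a hypothetical illustration of how a sudden loud sound might be flagged as a ‘sonic event’ (this is not the EAR-IT system’s actual method; the readings and thresholds are assumptions), here is a minimal sketch.

```python
# Minimal sketch: flag sudden jumps in sound level as candidate "sonic events".
# Not the EAR-IT system's method; readings and thresholds are illustrative.

def detect_events(levels_db, baseline_window=10, jump_db=20):
    """Return indices where the level jumps well above the recent baseline."""
    events = []
    for i in range(baseline_window, len(levels_db)):
        baseline = sum(levels_db[i - baseline_window:i]) / baseline_window
        if levels_db[i] - baseline >= jump_db:
            events.append(i)
    return events

readings_db = [55, 56, 54, 55, 57, 56, 55, 54, 56, 55, 92, 60, 57]
print(detect_events(readings_db))  # [10]: the sudden bang at index 10
```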

Seattle Launches Sweeping, Ethics-Based Privacy Overhaul


From the Privacy Advisor: “The City of Seattle this week launched a citywide privacy initiative aimed at providing greater transparency into the city’s data collection and use practices.
To that end, the city has convened a group of stakeholders, the Privacy Advisory Committee, comprising various government departments, to look at the ways the city is using data collected from practices as common as utility bill payments and renewing pet licenses or during the administration of emergency services like police and fire. By this summer, the committee will deliver suggested principles and a “privacy statement” to the City Council to provide direction on privacy practices citywide.
In addition, the city has partnered with the University of Washington, where Jan Whittington, assistant professor of urban design and planning and associate director at the Center for Information Assurance and Cybersecurity, has been given a $50,000 grant to look at open data, privacy and digital equity and how municipal data collection could harm consumers.
Responsible for all things privacy in this progressive city is Michael Mattmiller, who was hired to the position of chief technology officer (CTO) for the City of Seattle in June. Before his current gig, he worked as a senior strategist in enterprise cloud privacy for Microsoft. He said it’s an exciting time to be at the helm of the office because there’s momentum, there’s talent and there’s intention.
“We’re at this really interesting time where we have a City Council that strongly cares about privacy … We have a new police chief who wants to be very good on privacy … We also have a mayor who is focused on the city being an innovative leader in the way we interact with the public,” he said.
In fact, some City Council members have taken it upon themselves to meet with various groups and coalitions. “We have a really good, solid environment we think we can leverage to do something meaningful,” Mattmiller said….
Armbruster said the end goal is to create policies that will hold weight over time.
“I think when looking at privacy principles, from an ethical foundation, the idea is to create something that will last while technology dances around us,” she said, adding the principles should answer the question, “What do we stand for as a city and how do we want to move forward? So any technology that falls into our laps, we can evaluate and tailor or perhaps take a pass on as it falls under our ethical framework.”
The bottom line, Mattmiller said, is making a decision that says something about Seattle and where it stands.
“How do we craft a privacy policy that establishes who we want to be as a city and how we want to operate?” Mattmiller asked.”

Urban Observatory Is Snapping 9,000 Images A Day Of New York City


FastCo-Exist: “Astronomers have long built observatories to capture the night sky and beyond. Now researchers at NYU are borrowing astronomy’s methods and turning their cameras towards Manhattan’s famous skyline.
NYU’s Center for Urban Science and Progress has been running what’s likely the world’s first “urban observatory” of its kind for about a year. From atop a tall building in downtown Brooklyn (NYU won’t say its address, due to security concerns), two cameras—one regular one and one that captures infrared wavelengths—take panoramic images of lower and midtown Manhattan. One photo is snapped every 10 seconds. That’s 8,640 images a day, or more than 3 million since the project began (or about 50 terabytes of data).

“The real power of the urban observatory is that you have this synoptic imaging. By synoptic imaging, I mean these large swaths of the city,” says the project’s chief scientist Gregory Dobler, a former astrophysicist at Harvard University and the University of California, Santa Barbara, who now heads the 15-person observatory team at NYU.
Dobler’s team is collaborating with New York City officials on the project, which is now expanding to set up stations that study other parts of Manhattan and Brooklyn. Its major goal is to discover information about the urban landscape that can’t be seen at other scales. Such data could lead to applications like tracking which buildings are leaking energy (with the infrared camera), or measuring occupancy patterns of buildings at night, or perhaps detecting releases of toxic chemicals in an emergency.
The video above is an example. The top panel cycles through a one-minute slice of observatory images. The bottom panel is an analysis of the same images in which everything that remains static in each image is removed, such as buildings, trees, and roads. What’s left is an imprint of everything in flux within the scene—the clouds, the cars on the FDR Drive, the boat moving down the East River, and, importantly, a plume of smoke that puffs out of a building.
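One common way to separate the static scene from what is in flux is to estimate a background image, for example a per-pixel median over many frames, and subtract it. The sketch below illustrates that general idea only; it is an assumption, not the observatory’s actual pipeline.

```python
# Minimal sketch of median-background subtraction: static structure cancels
# out, while transient features (smoke plumes, traffic) remain. This is an
# illustrative stand-in, not the observatory's actual analysis code.
import numpy as np

def residuals(frames):
    """frames: (n_frames, height, width) array. Subtract the per-pixel median
    background so only changing elements survive."""
    background = np.median(frames, axis=0)
    return frames - background

# Synthetic example: six small grayscale frames with one transient "plume".
rng = np.random.default_rng(0)
frames = rng.integers(100, 110, size=(6, 4, 4)).astype(float)
frames[3, 1, 2] += 60  # a brief puff of smoke in frame 3
print(np.abs(residuals(frames)).max(axis=(1, 2)))  # frame 3 stands out
```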
“Periodically, a building will burp,” says Dobler. “It’s hard to see the puffs of smoke . . . but we can isolate that plume and essentially identify it.” (As Dobler has done by highlighting it in red in the top panel).
In response to the natural privacy concerns about this kind of program, Dobler emphasizes that the pictures are from an 8-megapixel camera (the same found in the iPhone 6) and aren’t clear enough to see inside a window or make out individuals. As a further privacy safeguard, the images are analyzed to look only at “aggregate” measures—such as the patterns of nighttime energy usage—rather than specific buildings. “We’re not really interested in looking at a given building, and saying, hey, these guys are particular offenders,” he says. (He also says the team is not looking at uses for the data in security applications.) However, Dobler was not able to answer a question as to whether the project’s partners at city agencies are able to access data analysis for individual buildings….”