Hachem, Sara, et al. in Research and Technologies for Society and Industry Leveraging a Better Tomorrow (RTSI): “While the design of smart city ICT systems of today is still largely focused on (and therefore limited to) passive sensing, the emergence of mobile crowd-sensing calls for more active citizen engagement in not only understanding but also shaping our societies. The Urban Civics Internet of Things (IoT) middleware enables such involvement while effectively closing several feedback loops by including citizens in the decision-making process, thus leading to smarter and healthier societies. We present our initial design and planned experimental evaluation of city-scale architecture components where data assimilation, actuation, and citizen engagement are key enablers toward democratization of urban data, longer-term transparency, and accountability of urban development policies. All of these are building blocks of smart cities and societies….(More)”
Analyzing 1.1 Billion NYC Taxi and Uber Trips
Todd W. Schneider: “The New York City Taxi & Limousine Commission has released a staggeringly detailed historical dataset covering over 1.1 billion individual taxi trips in the city from January 2009 through June 2015. Taken as a whole, the detailed trip-level data is more than just a vast list of taxi pickup and drop off coordinates: it’s a story of New York. How bad is the rush hour traffic from Midtown to JFK? Where does the Bridge and Tunnel crowd hang out on Saturday nights? What time do investment bankers get to work? How has Uber changed the landscape for taxis? And could Bruce Willis and Samuel L. Jackson have made it from 72nd and Broadway to Wall Street in less than 30 minutes? The dataset addresses all of these questions and many more.
I mapped the coordinates of every trip to local census tracts and neighborhoods, then set about extracting stories and meaning from the data. This post covers a lot, but for those who want to pursue more analysis on their own: everything in this post—the data, software, and code—is freely available. Full instructions to download and analyze the data for yourself are available on GitHub.
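Mapping pickup and drop-off coordinates to census tracts boils down to point-in-polygon tests against tract boundary files. A minimal sketch of the idea, with invented tract IDs and toy polygons standing in for real Census TIGER boundaries (a full analysis would use PostGIS or a spatial library against the official shapefiles):

```python
# Hypothetical sketch: assign taxi pickups to census tracts via
# ray-casting point-in-polygon tests. The tract IDs and polygon
# coordinates below are invented for illustration only.

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: does (lon, lat) fall inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):
            # longitude where this edge crosses the horizontal ray
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def assign_tract(lon, lat, tracts):
    """Return the first tract whose polygon contains the point."""
    for tract_id, polygon in tracts.items():
        if point_in_polygon(lon, lat, polygon):
            return tract_id
    return None

# Toy "tracts" near Midtown Manhattan (coordinates illustrative).
tracts = {
    "NY061-A": [(-73.99, 40.75), (-73.97, 40.75), (-73.97, 40.77), (-73.99, 40.77)],
    "NY061-B": [(-73.97, 40.75), (-73.95, 40.75), (-73.95, 40.77), (-73.97, 40.77)],
}

print(assign_tract(-73.98, 40.76, tracts))  # NY061-A
print(assign_tract(-73.96, 40.76, tracts))  # NY061-B
```

With 1.1 billion trips, the same lookup would be run once per pickup and once per drop-off, which is why spatially indexed tools matter at this scale.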
Jakarta’s Participatory Budget
Ramda Yanurzha in GovInsider: “…This is a map of Musrenbang 2014 in Jakarta. Red is a no-go, green means the proposal is approved.
To give you a brief background, musrenbang is Indonesia’s flavor of participatory, bottom-up budgeting. The idea is that people can propose any development for their neighbourhood through a multi-stage budgeting process, actively participating in shaping the final budget at the city level, which in turn determines each city’s allocation at the provincial level, and so on.
The catch is, I’m confident enough to say, that not many people (especially in big cities) are actually aware of this process. While civic activists tirelessly lament that the process is neither inclusive nor transparent, I lean toward a simpler explanation: most people simply can’t connect the dots.
People know that the public works agency fixed that 3-foot pothole last week. But it’s much less clear who is responsible for fixing a new streetlight in that dark alley, or where the money comes from. Someone might complain to the neighbourhood leader (Pak RT) and somehow the message gets through, but it’s very hard to trace how. Just keep complaining to the black box until you don’t have to. Very few people (mainly researchers) get to see the whole picture.
This has now changed, because the brand-new Jakarta open data portal provides musrenbang data going back to 2009: who proposed what to whom, for how much, where it should be implemented (geotagged!), down to the kelurahan/village level, and whether the proposal was accepted into the final city budget. For someone who advocates for better availability of open data in Indonesia and is eager to practice my data wrangling skills, it’s a goldmine.
Diving In

The data is also, as expected, incredibly messy. While, surprisingly, most of the proposed projects are geotagged, there are a lot of formatting inconsistencies that make the cleanup stage painful. Some are minor (m? meter? meter2? m2? meter persegi?) while others are perplexing (latitude: -6,547,843,512,000 – yes, a value of more than a billion). Annoyingly, hundreds of proposals point to the center of the National Monument, so it’s not exactly a representative dataset.
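The two cleanup problems described here, inconsistent unit labels and wildly implausible coordinates, are both mechanical to detect. A hypothetical sketch (the alias table and bounding box are my own illustration, not the actual cleanup rules used):

```python
import re

# Hypothetical cleanup sketch for the messiness described above:
# unit labels are normalized to a canonical form, and coordinates
# far outside Jakarta's bounding box are flagged as bad data.

UNIT_ALIASES = {
    "m": "m", "meter": "m", "mtr": "m",
    "m2": "m2", "meter2": "m2", "meter persegi": "m2", "m persegi": "m2",
}

def normalize_unit(raw):
    # Lowercase and strip punctuation like the stray "?" in "m?"
    key = re.sub(r"[^a-z0-9 ]", "", raw.strip().lower())
    return UNIT_ALIASES.get(key, "unknown")

def valid_jakarta_coord(lat, lon):
    # Rough bounding box around Jakarta; anything outside it
    # (e.g. latitude -6,547,843,512,000) is treated as invalid.
    return -6.4 <= lat <= -5.9 and 106.6 <= lon <= 107.1

print(normalize_unit("Meter Persegi"))             # m2
print(normalize_unit("m?"))                        # m
print(valid_jakarta_coord(-6.2, 106.8))            # True
print(valid_jakarta_coord(-6547843512000, 106.8))  # False
```

A pass like this won't catch the National Monument default-location problem, though; those points are individually plausible and can only be flagged by their suspicious frequency.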
For fellow data wranglers, pull requests to improve the data are gladly welcome over here. Ibam generously wrote an RT extractor to yield further location data, and I’m looking into OpenStreetMap RW boundary data to create a reverse geocoder for the points.
A couple hours of scrubbing in OpenRefine yielded a dataset clean enough to generate the CartoDB map embedded at the beginning of this piece. More precisely, it is a map of geotagged projects, with each point colored according to whether the proposal was accepted or rejected.
Numbers and Patterns
There were 40,511 proposals, some of them merged into broader ones, which gives us a grand total of 26,364 projects valued at over IDR 3,852,162,060,205, just over $250 million at the current exchange rate. This amount represents over 5% of Jakarta’s annual budget for 2015, with projects ranging from an IDR 27,500 (~$2) trash bin in Sumur Batu (that doesn’t sound right, does it?) to an IDR 54 billion, 1.5-kilometer drainage improvement in Koja….(More)”
RethinkCityHall.org
Press Release (Boston): “Mayor Martin J. Walsh today announced the launch of RethinkCityHall.org, a website designed to encourage civic participation in the City Hall campus plan study, a one-year comprehensive planning process that will serve as a roadmap for the operation and design improvements to City Hall and the plaza.
This announcement is one of three interrelated efforts that the City is pursuing to reinvigorate and bring new life to both City Hall and City Hall Plaza. As part of the Campus Plan Request for Qualifications (RFQ) that was released on June 8, 2015, the City has selected Utile, a local architecture and planning firm, to partner with the city to lead the campus plan study. Utile is teamed with Grimshaw Architects and Reed Hilderbrand for the design phases of the effort.
“I am excited to have Utile on board as we work to identify ways to activate our civic spaces,” said Mayor Walsh. “As we progress in the planning process, it is important to take inventory of all of our assets to be able to identify opportunities for improvement. This study will help us develop a thoughtful and forward-thinking plan to reimagine City Hall and the plaza as thriving, healthy and innovative civic spaces.”
“We are energized by Mayor Walsh’s challenge and are excited to work with the various constituencies to develop an innovative plan,” said Tim Love, a principal at Utile. “Thinking about the functional, programmatic and experiential aspects of both the building and plaza provides the opportunity to fundamentally rethink City Hall.”
Both the City and Utile are committed to an open and interactive process that engages members of the public, community groups, and professional organizations; as part of that effort, the website will include information about stakeholder meetings and public forums. Additionally, the website will be updated on an ongoing basis with the research, analysis, concepts and design scenarios generated by the consultant team….(More)”
How Big Data is Helping to Tackle Climate Change
Bernard Marr at DataInformed: “Climate scientists have been gathering a great deal of data for a long time, but analytics technology has only recently caught up. Now that cloud, distributed storage, and massive amounts of processing power are affordable for almost everyone, those data sets are being put to use. On top of that, the growing number of Internet of Things devices we carry around is adding to the amount of data we collect. And the rise of social media means more and more people are reporting environmental data and uploading photos and videos of their environment, which also can be analyzed for clues.
Perhaps one of the most ambitious projects that employ big data to study the environment is Microsoft’s Madingley, which is being developed with the intention of creating a simulation of all life on Earth. The project already provides a working simulation of the global carbon cycle, and it is hoped that, eventually, everything from deforestation to animal migration, pollution, and overfishing will be modeled in a real-time “virtual biosphere.” Just a few years ago, the idea of a simulation of the entire planet’s ecosphere would have seemed like ridiculous, pie-in-the-sky thinking. But today it’s something into which one of the world’s biggest companies is pouring serious money. Microsoft is doing this because it believes that analytical technology has finally caught up with the ability to collect and store data.
Another data giant that is developing tools to facilitate analysis of climate and ecological data is EMC. Working with scientists at Acadia National Park in Maine, the company has developed platforms to pull in crowd-sourced data from citizen science portals such as eBird and iNaturalist. This allows park administrators to monitor the impact of climate change on wildlife populations as well as to plan and implement conservation strategies.
Last year, the United Nations, under its Global Pulse data analytics initiative, launched the Big Data Climate Challenge, a competition aimed at promoting innovative data-driven climate change projects. Among the first to receive recognition under the program is Global Forest Watch, which combines satellite imagery, crowd-sourced witness accounts, and public datasets to track deforestation around the world, which is believed to be a leading man-made cause of climate change. The project has been promoted as a way for ethical businesses to ensure that their supply chain is not complicit in deforestation.
Other initiatives are targeted at a more personal level: for example, analyzing the transit routes available for an individual journey in Google Maps and recommending options based on each route’s carbon emissions.
The idea of “smart cities” is central to the concept of the Internet of Things – the idea that everyday objects and tools are becoming increasingly connected, interactive, and intelligent, and capable of communicating with each other independently of humans. Many of the ideas put forward by smart-city pioneers are grounded in climate awareness, such as reducing carbon dioxide emissions and energy waste across urban areas. Smart metering allows utility companies to increase or restrict the flow of electricity, gas, or water to reduce waste and ensure adequate supply at peak periods. Public transport can be efficiently planned to avoid wasted journeys and provide a reliable service that will encourage citizens to leave their cars at home.
These examples raise an important point: It’s apparent that data – big or small – can tell us if, how, and why climate change is happening. But, of course, this is only really valuable to us if it also can tell us what we can do about it. Some projects, such as Weathersafe, which helps coffee growers adapt to changing weather patterns and soil conditions, are designed to help humans deal with climate change. Others are designed to tackle the problem at the root, by highlighting the factors that cause it in the first place and showing us how we can change our behavior to minimize damage….(More)”
The promise and perils of predictive policing based on big data
H. V. Jagadish in the Conversation: “Police departments, like everyone else, would like to be more effective while spending less. Given the tremendous attention to big data in recent years, and the value it has provided in fields ranging from astronomy to medicine, it should be no surprise that police departments are using data analysis to inform deployment of scarce resources. Enter the era of what is called “predictive policing.”
Some form of predictive policing is likely now in force in a city near you. Memphis was an early adopter. Cities from Minneapolis to Miami have embraced predictive policing. Time magazine named predictive policing (with particular reference to the city of Santa Cruz) one of the 50 best inventions of 2011. New York City Police Commissioner William Bratton recently said that predictive policing is “the wave of the future.”
The term “predictive policing” suggests that the police can anticipate a crime and be there to stop it before it happens and/or apprehend the culprits right away. As the Los Angeles Times points out, it depends on “sophisticated computer analysis of information about previous crimes, to predict where and when crimes will occur.”
At a very basic level, it’s easy for anyone to read a crime map and identify neighborhoods with higher crime rates. It’s also easy to recognize that burglars tend to target businesses at night, when they are unoccupied, and to target homes during the day, when residents are away at work. The challenge is to take a combination of dozens of such factors to determine where crimes are more likely to happen and who is more likely to commit them. Predictive policing algorithms are getting increasingly good at such analysis. Indeed, such was the premise of the movie Minority Report, in which the police can arrest and convict murderers before they commit their crime.
Predicting a crime with certainty is something that science fiction can have a field day with. But as a data scientist, I can assure you that in reality we can come nowhere close to certainty, even with advanced technology. To begin with, predictions can be only as good as the input data, and quite often these input data have errors.
But even with perfect, error-free input data and unbiased processing, ultimately what the algorithms are determining are correlations. Even if we have perfect knowledge of your troubled childhood, your socializing with gang members, your lack of steady employment, your wacko posts on social media and your recent gun purchases, all that the best algorithm can do is to say it is likely, but not certain, that you will commit a violent crime. After all, to believe such predictions as guaranteed is to deny free will….
What data can do is give us probabilities, rather than certainty. Good data coupled with good analysis can give us very good estimates of probability. If you sum probabilities over many instances, you can usually get a robust estimate of the total.
For example, data analysis can provide a probability that a particular house will be broken into on a particular day based on historical records for similar houses in that neighborhood on similar days. An insurance company may add this up over all days in a year to decide how much to charge for insuring that house….(More)”
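The insurance arithmetic above is just an expected-value calculation: summing a daily probability over a year gives the expected annual count, which can then be priced against an average claim. A toy sketch with entirely invented risk numbers:

```python
# Hypothetical sketch of the probability-summing idea: daily break-in
# probabilities for one house (all numbers invented) summed over a
# year give an expected annual count an insurer could price against.

def daily_break_in_probability(day_of_year):
    base = 0.0005                                            # invented baseline risk
    weekend_bump = 0.0003 if day_of_year % 7 in (5, 6) else 0.0
    summer_bump = 0.0002 if 150 <= day_of_year <= 240 else 0.0
    return base + weekend_bump + summer_bump

# Expected number of break-ins over the year (sum of probabilities).
expected_annual = sum(daily_break_in_probability(d) for d in range(365))
print(round(expected_annual, 4))

# Expected loss = expected count * average claim size.
average_claim = 8000.0  # invented figure, in dollars
print(round(expected_annual * average_claim, 2))
```

Note how the uncertainty that dooms individual-level prediction washes out here: no single day's forecast is reliable, but the aggregate over many days is a robust estimate, which is exactly the author's point.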
E-Gov’s Untapped Potential for Cutting the Public Workforce
Robert D. Atkinson at Governing: “Since the flourishing of the Internet in the mid-1990s, e-government advocates have promised that information technology not only would make it easier to access public services but also would significantly increase government productivity and lower costs. Compared to the private sector, however, this promise has remained largely unfulfilled, in part because of a resistance to employing technology to replace government workers.
It’s not surprising, then, that state budget directors and budget committees usually look at IT as a cost rather than as a strategic investment that can produce a positive financial return for taxpayers. Until governments make a strong commitment to using IT to increase productivity — including as a means of workforce reduction — it will remain difficult to bring government into the 21st-century digital economy.
The benefits can be sizeable. My organization, the Information Technology and Innovation Foundation, estimates that if states focus on using IT to drive productivity, they stand to save more than $11 billion over the next five years. States can achieve these productivity gains in two primary ways:
First, they can use e-government to substitute for person-to-person interactions. For example, by moving just nine state services online — from one-stop business registration to online vehicle-license registration — Utah reduced the need for government employees to interact with citizens, saving an average of $13 per transaction.
And second, they can use IT to optimize performance and cut costs. In 2013, for example, Pennsylvania launched a mobile app to streamline the inspection process for roads and bridges, reducing the time it took for manual data entry. Inspectors saved about 15 minutes per survey, which added up to a savings of over $550,000 in 2013.
So if technology can cut costs, why has e-government not lived up to its original promise? One key reason is that most state governments have focused first and foremost on using IT to improve service quality and access rather than to increase productivity. In part, this is because boosting productivity involves reducing headcount, and state chief information officers and other policymakers often are unwilling to openly advocate for using technology in this way for fear that it will generate opposition from government workers and their unions. This is why replacing labor with modern IT tools has long been the third rail for the public-sector IT community.
This is not necessarily the case in some other nations that have moved to aggressively deploy IT to reduce headcount. The first goal of the Danish Agency for Digitisation’s strategic plan is “a productive and efficient public sector.” To get there, the agency plans to focus on automation of public administrative procedures. Denmark even introduced a rule in which all communications with government need to be done electronically, eliminating telephone receptionists at municipal offices. Likewise, the United Kingdom’s e-government strategy set a goal of increasing productivity by 2.5 percent, including through headcount cuts.
Another reason e-government has not lived up to its full promise is that many state IT systems are woefully out of date, especially compared to the systems the corporate sector uses. But if CIOs and other advocates of modern digital government are going to be able to make their case effectively for resources to bring their technology into the 21st century, they will need to make a more convincing bottom-line case to appropriators. This argument should be about saving money, including through workforce reduction.
Policymakers should base this case not just on savings for government but also for the state’s businesses and citizens….(More)”
Questioning Smart Urbanism: Is Data-Driven Governance a Panacea?
Alice Tang at the Chicago Policy Review: “In the era of data explosion, urban planners are increasingly relying on real-time, streaming data generated by “smart” devices to assist with city management. “Smart cities,” referring to cities that implement pervasive and ubiquitous computing in urban planning, are widely discussed in academia, business, and government. These cities are characterized not only by their use of technology but also by their innovation-driven economies and collaborative, data-driven city governance. Smart urbanism can seem like an effective strategy to create more efficient, sustainable, productive, and open cities. However, there are emerging concerns about the potential risks in the long-term development of smart cities, including the politics of big data, technocratic governance, technological lock-ins, data and network security, and privacy risks.
In a study entitled, “The Real-Time City? Big Data and Smart Urbanism,” Rob Kitchin provides a critical reflection on the potential negative effects of data-driven city governance on social development—a topic he claims deserves greater governmental, academic, and social attention.
In contrast to traditional datasets that rely on samples or are aggregated to a coarse scale, “big data” is huge in volume, high in velocity, and diverse in variety. Since the early 2000s, there has been explosive growth in data volume due to the rapid development and implementation of technology infrastructure, including networks, information management, and data storage. Big data can be generated from directed, automated, and volunteered sources. Automated data generation is of particular interest to urban planners. One example Kitchin cites is urban sensor networks, which allow city governments to monitor the movements and statuses of individuals, materials, and structures throughout the urban environment by analyzing real-time data.
With the huge amount of streaming data collected by smart infrastructure, many city governments use real-time analysis to manage different aspects of city operations. There has been a recent trend in centralizing data streams into a single hub, integrating all kinds of surveillance and analytics. These one-stop data centers make it easier for analysts to cross-reference data, spot patterns, identify problems, and allocate resources. The data are also often accessible by field workers via operations platforms. In London and some other cities, real-time data are visualized on “city dashboards” and communicated to citizens, providing convenient access to city information.
However, the real-time city is not a flawless solution to all the problems faced by city managers. The primary concern is the politics of big, urban data. Although raw data are often perceived as neutral and objective, no data are free of bias; the collection of data is a subjective process that can be shaped by various confounding factors. The presentation of data can also be manipulated to answer a specific question or enact a particular political vision….(More)”
‘Democracy vouchers’
Gregory Krieg at CNN: “Democracy vouchers” could be coming to an election near you. Last week, more than 60% of Seattle voters approved the so-called “Honest Elections” measure, or Initiative 122, a campaign finance reform plan offering a novel way of steering public funds to candidates who are willing to swear off big money PACs.
For supporters, the victory — authorizing the use by voters of publicly funded “democracy vouchers” that they can dole out to favored candidates — marks what they hope will be the first step forward in a wide-ranging reform effort spreading to other cities and states in the coming year….
The voucher model also is “a one-two punch” for candidates, Silver said. “They become more dependent on their constituents because their constituents become their funders, and No. 2, they’re part of what I would call a ‘dilution strategy’ — you dilute the space with lots of small-dollar contributions to offset the undue influence of super PACs.”
How “democracy vouchers” work
Beginning next summer, Seattle voters are expected to begin receiving $100 from the city, parceled out in four $25 vouchers, to contribute to local candidates who accept the new law’s restrictions, including not taking funds from PACs, adhering to strict spending caps, and enacting greater transparency. Candidates can redeem the vouchers with the city for real campaign cash, which will likely flow from increased property taxes.
The reform effort began at the grassroots, but morphed into a slickly managed operation that spent nearly $1.4 million, with more than half of that flowing from groups outside the city.
Alan Durning, founder of the nonprofit sustainability think tank Sightline, is an architect of the Seattle initiative. He believes the campaign helped identify a key problem with other reform plans.
“We know that one of the strongest arguments against public funding for campaigns is the idea of giving tax dollars to candidates that you disagree with,” Durning told CNN. “There are a lot of people who hate the idea.”
Currently, most such programs match small donations with public funds for candidates who meet a host of varying requirements. In these cases, taxpayer money goes directly from the government to the campaigns, limiting voters’ connection to the process.
“The benefit of vouchers … is you can think about it as giving the first $100 of your own taxes to the candidate that you prefer,” Durning explained. “Your money is going to the candidate you send it to — so it keeps the choice with the individual voter.”
He added that the use of vouchers can also help the approach appeal to conservative voters, who generally are supportive of voucher-type programs and choice.
But critics call that a misleading argument.
“You’re still taking money from people and giving it to politicians who they may not necessarily want to support,” said Patrick Basham, the founder and director of the Democracy Institute, a libertarian think tank.
“Now, if you, as Voter X, give your four $25 vouchers to Candidate Y, then that’s your choice, but only some of [the money] came from you. It also came from other people.”…(More)”
2015 Digital Cities: Winners Experiment with Forward-Thinking Tech Projects
List of winners at Govtech:
1st place // City of Philadelphia, Pa.
A savvy mix of data-driven citizen engagement, tech modernization and outside-the-box thinking powered Philadelphia to its first-place ranking. A new city website launched last year is designed to provide new levels of user convenience. For instance, three navigation options are squeezed into the top of the site — a search bar, a list of common actions like “report a problem” or “pay a bill,” and a menu of city functions arranged topically — giving citizens multiple ways to find what they need. The site was created using agile principles, launching as a work in progress in December and shaped by user feedback. The city also is broadening its use of open data as a citizen-engagement tool. A new generation of civic apps relies on open data sets to give residents easy access to property tax calculations, property ownership information and detailed maps of various city resources. These improvements in customer-facing services have been facilitated by upgrades to back-end systems that are improving reliability and reducing staff support requirements. The city estimates that half of its IT systems now are procured as a service. Finally, an interesting pilot involving the city IT department and a local middle school is aimed at drawing more kids into STEM-related careers. Students met weekly in the city Innovation Lab for a series of hands-on experiences led by members of the Philadelphia Office of Information Technology.
2nd place // City of Los Angeles, Calif.
Second-ranked Los Angeles is developing a new model for funding innovative ideas, leveraging private-sector platforms to improve services, streamlining internal processes and closing the broadband gap. The city established a $1 million innovation fund late last year to seed pilot projects generated by city employees’ suggestions. More than a dozen projects have been launched so far. Through open APIs, the city trades traffic information with Google’s Waze traffic app. The app consumes city traffic data to warn drivers about closed roads, hazards and dangerous intersections, while the city transportation department uses information submitted by Waze users to identify potholes, illegal road construction and traffic patterns. MyPayLA, launched by the LA Controller’s Office and the city Information Technology Agency, is a mobile app that lets city employees view their payroll information on a mobile device. And the CityLinkLA broadband initiative is designed to attract broadband providers to the city with expedited permitting and access to existing assets like streetlights, real estate and fiber.
2nd place // City of Louisville, Ky.
Louisville’s mobile-friendly Web portal garnered the city a second-place finish in the Center for Digital Government’s Best of the Web awards earlier this year. Now, Louisville has a No. 2 ranking in the 2015 Digital Cities Survey to add to its trophy case. Besides running an excellent website — built on the open source Drupal platform and hosted in the cloud — Louisville is equipping its entire police force with body-worn cameras and expects to be finished by the end of 2015. Video from 1,000 officers, as well as footage from Metro Watch cameras placed around the city, will be stored in the cloud. Louisville’s Metro Police Department, one of 21 cities involved in the White House Police Data Initiative, also became one of the first in the nation to release data sets on assaulted officers, arrests and citations, and hate crimes on the city’s open data portal. In addition, a public-private partnership called Code Louisville offers free technology training to local residents. More than 500 people have taken 12-week classes to learn Web or mobile development skills.
3rd place // City of Kansas City, Mo.
Kansas City’s Art of Data initiative may be one of the nation’s most creative attempts to engage citizens through open data. The city selected 10 local artists earlier this year to turn information from its open data site into visual art. The artists pulled information from 10 different data sets, ranging from life expectancy by ZIP code to citizen satisfaction with the safety of their neighborhoods. The exhibit drew a large crowd when it opened in June, according to the city, and more than 3,000 residents eventually viewed the works of art. Kansas City also was chosen to participate in a new HUD digital inclusion program called ConnectHome, which will offer broadband access, training, digital literacy programs and devices for residents in assisted housing units. And the city is working with a local startup business, RFP365, to simplify its RFP process. Through a pilot partnership, Kansas City will use the RFP365 platform — which lets buyers track and receive bids from vendors and suppliers — to make the government purchasing process easier and more transparent.
3rd place // City of Phoenix, Ariz.
The development of a new citywide transportation plan in Phoenix offers a great example of how to use digital engagement tools. Using the MindMixer platform, the city developed a website to let citizens suggest ideas for new transit services and street infrastructure, as well as discuss a range of transportation-related issues. Using polling, mapping, open-ended questions and discussion prompts, residents directly helped to develop the plan. The engagement process reached more than 3,700 residents and generated hundreds of comments online. In addition, a city-led technology summit held late last year brought together big companies, small businesses and citizens to discuss how technology could improve city operations and boost economic development. And new court technology lets attorneys receive hearing notifications on a mobile device and enables Web and interactive voice response (IVR) payments for a variety of cases.
…(More)”