Chicago uses big data to save itself from urban ills


Aviva Rutkin in the New Scientist: “This year in Chicago, some kids will get lead poisoning from the paint or pipes in their homes. Some restaurants will cook food in unsanitary conditions and, here and there, a street corner will be suddenly overrun with rats. These kinds of dangers are hard to avoid in a city of more than 2.5 million people. The problem is, no one knows for certain where or when they will pop up.

The Chicago city government is hoping to change that by knitting powerful predictive models into its everyday city inspections. Its latest project, currently in pilot tests, analyses factors such as home inspection records and census data, and uses the results to guess which buildings are likely to cause lead poisoning in children – a problem that affects around 500,000 children in the US each year. The idea is to identify trouble spots before kids are exposed to dangerous lead levels.
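
New Scientist does not describe the model’s internals, but the basic shape of such a risk-scoring tool is straightforward; the sketch below is a minimal, hypothetical version in Python, with invented features and labels standing in for the inspection and census inputs the article mentions.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Toy data: one row per building, with features loosely modeled on the article's
# "home inspection records and census data" (all values are invented).
buildings = pd.DataFrame({
    "year_built":            [1898, 1962, 1925, 2004, 1910],
    "past_violations":       [4, 0, 2, 0, 6],
    "pct_pre_1978_housing":  [0.90, 0.30, 0.70, 0.10, 0.95],
    "median_tract_income":   [31000, 58000, 42000, 76000, 28000],
})
# Label: whether an elevated blood-lead case was previously linked to the building
had_ebll_case = pd.Series([1, 0, 0, 0, 1])

model = GradientBoostingClassifier(random_state=0).fit(buildings, had_ebll_case)

# Rank buildings by predicted risk so inspectors can prioritize visits
scores = model.predict_proba(buildings)[:, 1]
print(buildings.assign(risk_score=scores).sort_values("risk_score", ascending=False))
```

In practice the value comes less from the classifier itself than from using its ranking to order inspection visits, which is the shift from responding to preventing that Bhatt describes.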

“We are able to prevent problems instead of just respond to them,” says Jay Bhatt, chief innovation officer at the Chicago Department of Public Health. “These models are just the beginning of the use of predictive analytics in public health and we are excited to be at the forefront of these efforts.”

Chicago’s projects are based on the thinking that cities already have what they need to raise their municipal IQ: piles and piles of data. In 2012, city officials built WindyGrid, a platform that collected data such as historical facts about buildings and up-to-date streams such as bus locations, tweets and 911 calls. The project was designed as a proof of concept and was never released publicly, but it led to another, called Plenario, that allowed the public to access the data via an online portal.
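
Plenario’s central idea is to put many datasets onto a shared spatial and temporal index so they can be queried together; the toy sketch below illustrates that pattern with a hand-rolled grid index and invented events, not Plenario’s actual API or schema.

```python
from collections import defaultdict
from datetime import datetime

# Toy spatio-temporal index: bucket events by (lat/lon grid cell, date),
# the basic pattern behind querying many city datasets on one map.
def cell(lat, lon, size=0.01):
    return (round(lat / size), round(lon / size))

index = defaultdict(list)

events = [  # invented records standing in for 911 calls, inspections, bus pings, etc.
    {"kind": "911_call",   "lat": 41.8781, "lon": -87.6298, "time": datetime(2014, 9, 1, 14)},
    {"kind": "inspection", "lat": 41.8784, "lon": -87.6301, "time": datetime(2014, 9, 1, 9)},
    {"kind": "bus_ping",   "lat": 41.9000, "lon": -87.6500, "time": datetime(2014, 9, 1, 14)},
]
for e in events:
    index[(cell(e["lat"], e["lon"]), e["time"].date())].append(e)

# Example query: what happened near this corner on this day?
query_cell = cell(41.8782, -87.6300)
print(index[(query_cell, datetime(2014, 9, 1).date())])
```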

The experience of building those tools has led to more practical applications. For example, one tool matches calls to the city’s municipal hotline complaining about rats with conditions that draw rats to a particular area, such as excessive moisture from a leaking pipe, or with an increase in complaints about garbage. This allows officials to proactively deploy sanitation crews to potential hotspots. It seems to be working: last year, resident requests for rodent control dropped by 15 per cent.
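
The article does not say exactly how the matching is done; one plausible sketch is to count rat-attracting complaint types per block and flag blocks that cross a threshold, as in the toy example below (the complaint records, precursor list, and threshold are all invented).

```python
import pandas as pd

# Invented 311-style records; real Chicago data lives on the city's open data portal.
calls = pd.DataFrame({
    "block": ["100-W-LAKE", "100-W-LAKE", "100-W-LAKE", "500-N-STATE", "500-N-STATE"],
    "type":  ["garbage", "water_leak", "garbage", "garbage", "rodent"],
})

# Complaint types assumed to precede rat problems, per the conditions the article describes
precursors = {"garbage", "water_leak", "restaurant_violation"}

counts = (calls[calls["type"].isin(precursors)]
          .groupby("block").size()
          .rename("precursor_complaints"))

# Flag blocks with enough precursor complaints for a proactive baiting-crew visit
THRESHOLD = 3
hotspots = counts[counts >= THRESHOLD]
print(hotspots)
```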

Some predictions are trickier to get right. Charlie Catlett, director of the Urban Center for Computation and Data in Chicago, is investigating an old axiom among city cops: that violent crime tends to spike when there’s a sudden jump in temperature. But he’s finding it difficult to test its validity in the absence of a plausible theory for why it might be the case. “For a lot of things about cities, we don’t have that underlying theory that tells us why cities work the way they do,” says Catlett.
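
A first pass at testing the axiom, even without a theory, is simply to correlate day-over-day temperature jumps with daily violent-crime counts; the sketch below does that on synthetic series, which is also a reminder that a bare correlation cannot separate a genuine sudden-jump effect from seasonal confounding.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic daily series standing in for real temperature and crime data:
# a seasonal temperature curve plus noise, and crime counts that (by construction
# here) respond weakly to upward temperature jumps.
temps = 15 + 10 * np.sin(np.linspace(0, 3 * np.pi, 365)) + rng.normal(0, 3, 365)
temp_jump = np.diff(temps, prepend=temps[0])
crimes = rng.poisson(lam=20 + 0.5 * np.clip(temp_jump, 0, None))

# Does a day-over-day jump in temperature correlate with that day's crime count?
r, p = pearsonr(temp_jump, crimes)
print(f"correlation r = {r:.2f}, p-value = {p:.3f}")
```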

Still, predictive modelling is maturing, as other cities succeed in using it to tackle urban ills…. Such efforts can be a boon for cities, making them more productive, efficient and safe, says Rob Kitchin of Maynooth University in Ireland, who helped launch a real-time data site for Dublin last month called the Dublin Dashboard. But he cautions that there’s a limit to how far these systems can aid us. Knowing that a particular street corner is likely to be overrun with rats tomorrow doesn’t address what caused the infestation in the first place. “You might be able to create a sticking plaster or be able to manage it more efficiently, but you’re not going to be able to solve the deep structural problems….”

Traversing Digital Babel


New book by Alon Peled: “The computer systems of government agencies are notoriously complex. New technologies are piled on older technologies, creating layers that call to mind an archaeological dig. Obsolete programming languages and closed mainframe designs offer barriers to integration with other agency systems. Worldwide, these unwieldy systems waste billions of dollars, keep citizens from receiving services, and even—as seen in interoperability failures on 9/11 and during Hurricane Katrina—cost lives. In this book, Alon Peled offers a groundbreaking approach for enabling information sharing among public sector agencies: using selective incentives to “nudge” agencies to exchange information assets. Peled proposes the establishment of a Public Sector Information Exchange (PSIE), through which agencies would trade information.
After describing public sector information sharing failures and the advantages of incentivized sharing, Peled examines the U.S. Open Data program, and the gap between its rhetoric and results. He offers examples of creative public sector information sharing in the United States, Australia, Brazil, the Netherlands, and Iceland. Peled argues that information is a contested commodity, and draws lessons from the trade histories of other contested commodities—including cadavers for anatomical dissection in nineteenth-century Britain. He explains how agencies can exchange information as a contested commodity through a PSIE program tailored to an individual country’s needs, and he describes the legal, economic, and technical foundations of such a program. Touching on issues from data ownership to freedom of information, Peled offers pragmatic advice to politicians, bureaucrats, technologists, and citizens for revitalizing critical information flows.”

The Role Of Open Data In Choosing a Neighborhood


PlaceILive Blog: “To what extent is it important to get familiar with our environment?
If we think about how the world around us has changed over the years, it is not surprising that, while walking to work, we might come across new little shops, restaurants, or gas stations we had never noticed before. Likewise, how many times have we wandered for hours just to find a green space for a run, only to discover that the one we found was even more polluted than the rest of the urban area?
Citizens are not always properly informed about the evolution of the places they live in. That is why it is crucial for people to have constant access to accurate, up-to-date information about the neighborhood they have chosen or are about to choose.
London is clear evidence of how transparency in providing data is fundamental to succeeding as a Smart City.
The GLA’s London Datastore, for instance, is a public platform of datasets offering up-to-date figures on the city’s main services, as well as on residents’ lifestyles and environmental risks. These data are then made more easily accessible to the community through the London Dashboard.
The importance of providing free information is also demonstrated by the integration of maps, which are an efficient means of geolocation. A map that makes it easy to find all the services you need close by can be decisive when searching for a place to live.
(Image source: Smart London Plan)
The Open Data Index, published by the Open Knowledge Foundation in 2013, is another useful tool for data retrieval: it ranks countries with scores based on the openness and availability of data attributes such as transport timetables and national statistics.
Here it is possible to check the UK Open Data Census and the US City Open Data Census.
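As a rough illustration of how such a census-style score can be computed, the sketch below sums weights for the openness criteria a dataset meets; the criteria and weights are invented for illustration and are not the Index’s actual rubric.

```python
# Illustrative scoring in the spirit of the Open Data Index: each dataset is
# checked against openness criteria, and the total sums the weights it meets.
# The criteria and weights below are assumptions, not the Index's real rubric.
CRITERIA = {
    "exists": 5,
    "online": 5,
    "machine_readable": 15,
    "free_of_charge": 15,
    "openly_licensed": 30,
}

def dataset_score(checks: dict) -> int:
    return sum(weight for name, weight in CRITERIA.items() if checks.get(name))

country = {
    "transport_timetables": {"exists": True, "online": True, "machine_readable": True},
    "national_statistics":  {"exists": True, "online": True, "machine_readable": True,
                             "free_of_charge": True, "openly_licensed": True},
}
print({name: dataset_score(checks) for name, checks in country.items()})
print("total:", sum(dataset_score(checks) for checks in country.values()))
```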
As noted, making open data available and easily findable online has not only been a success for US cities but has also benefited app makers and civic hackers. Lauren Reid, a spokesperson for Code for America, told Government Technology: “The more data we have, the better picture we have of the open data landscape.”
That, on the whole, is where PlaceILive puts its greatest effort: fostering a new awareness of the environment by providing free information, in order to help citizens choose the best place to live.
The result is easy to see. The website’s homepage lets visitors type in an address of interest and displays an overview of neighborhood indicators together with a Life Quality Index calculated for every point on the map.
Searching for the nearest medical institutions, schools or ATMs thus becomes immediate and clear, as does browsing general information about the community. Moreover, the data’s reliability and accessibility are continually reviewed by a team of professionals with expertise in data analysis, mapping, IT architecture and global markets.
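PlaceILive does not publish its formula in the post, but a Life Quality Index of this kind can be sketched as a weighted average of normalized neighborhood indicators; everything in the example below (indicators, weights, values) is invented for illustration.

```python
# Toy "life quality index": a weighted average of normalized neighborhood
# indicators, scaled to 0-100. Indicators, weights and values are invented;
# PlaceILive's actual formula is not described in the post above.
INDICATORS = {           # each value in [0, 1], higher is better
    "safety":           0.8,
    "transport_access": 0.6,
    "green_space":      0.4,
    "schools_nearby":   0.7,
    "air_quality":      0.5,
}
WEIGHTS = {
    "safety":           0.30,
    "transport_access": 0.25,
    "green_space":      0.15,
    "schools_nearby":   0.15,
    "air_quality":      0.15,
}

def life_quality_index(indicators, weights):
    total_weight = sum(weights.values())
    score = sum(indicators[name] * weights[name] for name in indicators)
    return round(100 * score / total_weight)

print(life_quality_index(INDICATORS, WEIGHTS))  # 63 for this toy point on the map
```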
For the moment the company’s work is focused on London, Berlin, Chicago, San Francisco and New York, with a longer-term goal of covering more than 200 cities.
In the US City Open Data Census, San Francisco earned the highest score, proof of the city’s work in putting technological expertise at everyone’s disposal and in meeting users’ needs through a meticulous selection of datasets. The city is building on this with a new investment, in partnership with the University of Chicago, in a data analytics dashboard on sustainability performance statistics named the Sustainable Systems Framework, which is expected to be released in beta by the end of the first quarter of 2015.
 
Another remarkable contribution to the spread of open data comes from the Bartlett Centre for Advanced Spatial Analysis (CASA) at University College London (UCL); Oliver O’Brien, a researcher in the UCL Department of Geography and a software developer at CASA, is one of the contributors to this cause.
Among his projects, an interesting accomplishment is London’s CityDashboard, a real-time control panel of spatial data reports. The web page also lets users view the data on a simplified map and look at other UK cities’ dashboards.
In addition, his Bike Share Map offers a live global view of bicycle-sharing systems in over a hundred cities around the world; bike sharing has recently drawn greater public attention as a novel form of transportation, above all in Europe and China….”

CC Science → Sensored City


Citizen Sourced Data: “We routinely submit data to others and then worry about liberating the data from the silos. What if we could invert the model? What if collected data were first put into a completely free and open repository accessible to everyone so anyone could build applications with the data? What if the data itself were free so everyone could have an equal opportunity to create and even monetize their creativity? Funded by a generous grant from the Robert Wood Johnson Foundation, we intend to do just that.
Partnering with Manylabs, a San Francisco-based sensor tools and education nonprofit, and Urban Matter, Inc., a Brooklyn-based design studio, and in collaboration with the City of Louisville, Kentucky, and Propeller Health, maker of a mobile platform for respiratory health management, we will design, develop and install a network of sensor-based hardware that will collect environmental information at high temporal and spatial scales and store it in a software platform designed explicitly for storing and retrieving such data.
Further, we will design, create and install a public data art installation that will be powered by the data we collect thereby communicating back to the public what has been collected about them.”
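
The announcement stays at the architecture level; as a rough sketch, a sensor node publishing a reading to an open repository might send a small JSON document like the one below, where the endpoint URL, node identifier and field names are all placeholders rather than the project’s real schema.

```python
# Minimal sketch of a sensor node posting an environmental reading to an open
# data repository. The endpoint, node id, and field names are assumptions for
# illustration; the project's actual platform is not specified in the excerpt.
import json
import time
import urllib.request

reading = {
    "node_id": "louisville-park-03",      # hypothetical node identifier
    "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "lat": 38.2527, "lon": -85.7585,
    "pm2_5_ugm3": 12.4,                   # fine particulate matter
    "temperature_c": 21.7,
    "humidity_pct": 48.0,
}

req = urllib.request.Request(
    "https://example.org/api/readings",   # placeholder URL, not a real endpoint
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send against a real endpoint
print(json.dumps(reading, indent=2))
```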

Innovation in Philanthropy is not a Hack-a-thon


Sam McAfee in Medium: “…Antiquated funding models and lack of a rapid data-driven evaluation process aren’t the only issues though. Most of the big ideas in the technology-for-social-impact space are focused either on incremental improvements to existing service models, maybe leveraging online services or mobile applications to improve cost-efficiency marginally. Or they solve only a very narrow niche problem for a small audience, often applying a technology that was already in development, and just happened to find a solution in the field.

Innovation Requires Disruption

When you look at innovation in the commercial world, like the Ubers and AirBnBs of the world, what you see is a clear and substantive break from previous modes of thinking about transportation and accommodation. And it’s not the technology itself that is all that impressive. There is nothing ground-breaking technically under the hood of either of those products that wasn’t already lying around for a decade. What makes them different is that they created business models that stepped completely out of the existing taxi and hotel verticals, and simply used technology to leverage existing frustrations with those antiquated models and harness latent demands, to produce a new, vibrant commercial ecosystem.

Now, let’s imagine the same framework in the social sector, where there are equivalent long-standing traditional modes of providing resources. To find new ways of meeting human needs that disrupt those models requires both safe-to-fail experimentation and rapid feedback and iteration in the field, with clear success criteria. Such rapid development can only be accomplished by a sharp, nimble and multifaceted team of thinkers and doers who are passionate about the problem, yes, but also empowered and enabled to break a few institutional eggs on the way to the creative omelet.

Agile and Lean are Proven Methods

It turns out that there are proven working models for cultivating and fostering this kind of innovative thinking and experimentation. As I mentioned above, agile and lean are probably the single greatest contribution to the world by the tech sector, far more impactful than any particular technology produced by it. Small, cross-functional teams working on tight, iterative timeframes, using an iterative data-informed methodology, can create new and disruptive solutions to big, difficult problems. They are able to do this precisely because they are unhindered by the hulking bureaucratic structures of the old guard. This is precisely why so many Fortune 500 companies are experimenting with innovation and R&D laboratories. Because they know their existing staff, structures, and processes cannot produce innovation within those constraints. Only the small, nimble teams can do it, and they can only do it if they are kept separate from, protected from even, the traditional production systems of the previous product cycle.

Yet big philanthropy has barely experimented with this model, only trying it in a few isolated instances. Here at Neo, for example, we are working on a project for teachers funded by a forward-thinking foundation. What our client is trying to disrupt is no less than the entire US education system, and with goals and measurements developed by teachers for teachers, not by Silicon Valley hotshots who have no clue how to fix education.

To start with, the project was funded in iterations of six weeks at a time, each with a distinct and measurable goal. We built a small cross-functional team to tackle some of the tougher issues faced by teachers trying to raise the level of excellence in their classrooms. The team was empowered to talk directly to teachers, and incorporate their feedback into new versions of the project, released on almost a daily basis. We have iterated the design more than sixteen times in less than four months, and it’s starting to really take shape.

We have no idea whether this particular project will be successful in the long run. But what we do know is that the client and their funder have had the courage to step out of the traditional project funding models and apply agile and lean thinking to a very tough problem. And we’re proud to be invited along for the ride.

The vast majority of the social sector is still trying to tackle social problems with program and funding models that were pioneered early in the last century. Agile and lean methods hold the key to finally breaking the mold of the old, traditional model of resourcing social change initiatives. The philanthropic community should be interested in the agile and lean methods produced by the technology sector, not the money produced by it, and start reorganizing project teams, resource allocation strategies, and timelines in line with this proven innovation model.

Only then will we be in a position to really innovate for social change.”

Canada's Action Plan on Open Government 2014-2016


Draft action plan: “Canada’s second Action Plan on Open Government consists of twelve commitments that will advance open government principles in Canada over the next two years and beyond. The Directive on Open Government, which gives new policy direction to federal departments and agencies on open government, will provide foundational support for each of the additional commitments, which fall under three streams: Open Data, Open Information, and Open Dialogue.
Figure 1: Our Commitments (Open Government Directive diagram)

Killer Apps in the Gigabit Age


New Pew report: “The age of gigabit connectivity is dawning and will advance in coming years. The only question is how quickly it might become widespread. A gigabit connection can deliver 1,000 megabits of information per second (Mbps). Globally, cloud service provider Akamai reports that the average global connection speed in the first quarter of 2014 was 3.9 Mbps, with South Korea reporting the highest average connection speed, 23.6 Mbps, and the US at 10.5 Mbps.
In some respects, gigabit connectivity is not a new development. The US scientific community has been using hyper-fast networks for several years, changing the pace of data sharing and enabling levels of collaboration in scientific disciplines that were unimaginable a generation ago.
Gigabit speeds for the “average Internet user” are just arriving in select areas of the world. In the US, Google ran a competition in 2010 for communities to pitch themselves for the construction of the first Google Fiber network running at 1 gigabit per second—Internet speeds 50-100 times faster than the majority of Americans now enjoy. Kansas City was chosen among 1,100 entrants, and residents are now signing up for the service. The firm has announced plans to build a gigabit network in Austin, Texas, and perhaps 34 other communities. In response, AT&T has said it expects to begin building gigabit networks in up to 100 US cities. The cities of Chattanooga, Tennessee; Lafayette, Louisiana; and Bristol, Virginia, have super speedy networks, and pockets of gigabit connectivity are in use in parts of Las Vegas, Omaha, Santa Monica, and several Vermont communities. There are also other regional efforts: Falcon Broadband in Colorado Springs, Colorado; Brooklyn Fiber in New York; Monkey Brains in San Francisco; MINET Fiber in Oregon; Wicked Fiber in Lawrence, Kansas; and Sonic.net in California, among others. NewWave expects to launch gigabit connections in 2015 in Poplar Bluff, Missouri; and in Monroe, Rayville, Delhi, and Tallulah, Louisiana; and Suddenlink Communications has launched Operation GigaSpeed.
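To make those speed figures concrete, a quick back-of-the-envelope calculation shows how long a sizeable download takes at the global average, the US average, and a full gigabit connection (remembering that connection speeds are quoted in bits, not bytes):

```python
# How long does a 5 GB file take to download at various connection speeds?
# 1 byte = 8 bits, so a 1,000 Mbps (1 gigabit) link moves 125 megabytes per second.
FILE_GB = 5
file_megabits = FILE_GB * 1000 * 8  # 40,000 megabits

for label, mbps in [("global average (Q1 2014)", 3.9),
                    ("US average (Q1 2014)", 10.5),
                    ("gigabit connection", 1000.0)]:
    seconds = file_megabits / mbps
    print(f"{label:>24}: {seconds / 60:7.1f} minutes")
```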
In 2014, Google and Verizon were among the innovators announcing that they are testing the capabilities for currently installed fiber networks to carry data even more efficiently—at 10 gigabits per second—to businesses that handle large amounts of Internet traffic.
To explore the possibilities of the next leap in connectivity we asked thousands of experts and Internet builders to share their thoughts about likely new Internet activities and applications that might emerge in the gigabit age. We call this a canvassing because it is not a representative, randomized survey. Its findings emerge from an “opt in” invitation to experts, many of whom play active roles in Internet evolution as technology builders, researchers, managers, policymakers, marketers, and analysts. We also invited comments from those who have made insightful predictions to our previous queries about the future of the Internet. (For more details, please see the section “About this Canvassing of Experts.”)…”

New Technology and the Prevention of Violence and Conflict


Report edited by Francesco Mancini for the International Peace Institute: “In an era of unprecedented interconnectivity, this report explores the ways in which new technologies can assist international actors, governments, and civil society organizations to more effectively prevent violence and conflict. It examines the contributions that cell phones, social media, crowdsourcing, crisis mapping, blogging, and big data analytics can make to short-term efforts to forestall crises and to long-term initiatives to address the root causes of violence.
Five case studies assess the use of such tools in a variety of regions (Africa, Asia, Latin America) experiencing different types of violence (criminal violence, election-related violence, armed conflict, short-term crisis) in different political contexts (restrictive and collaborative governments).
Drawing on lessons and insights from across the cases, the authors outline a how-to guide for leveraging new technology in conflict-prevention efforts:
1. Examine all tools.
2. Consider the context.
3. Do no harm.
4. Integrate local input.
5. Help information flow horizontally.
6. Establish consensus regarding data use.
7. Foster partnerships for better results.”