The Economist: “WITH at least 4,500 people dead, public-health authorities in west Africa and worldwide are struggling to contain Ebola. Borders have been closed, air passengers screened, schools suspended. But a promising tool for epidemiologists lies unused: mobile-phone data.
When people make mobile-phone calls, the network generates a call data record (CDR) containing such information as the phone numbers of the caller and receiver, the time of the call and the tower that handled it—which gives a rough indication of the device’s location. This information gives researchers insight into mobility patterns. Indeed, phone companies use these data to decide where to build base stations and thus improve their networks, and city planners use them to identify places to extend public transport.
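As a rough illustration of how such records translate into mobility information, the sketch below models a CDR as a small record and counts tower-to-tower transitions per caller. The field names and schema are assumptions for illustration; real operators' CDR formats vary.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class CDR:
    """One call data record: who called whom, when, and via which tower.
    Field names are hypothetical; real CDR schemas differ by operator."""
    caller: str
    receiver: str
    timestamp: int   # Unix time of the call
    tower_id: str    # cell tower that handled the call

def movement_counts(records: list[CDR]) -> Counter:
    """Count tower-to-tower transitions per caller as a crude mobility signal."""
    by_caller: dict[str, list[CDR]] = {}
    for r in sorted(records, key=lambda r: r.timestamp):
        by_caller.setdefault(r.caller, []).append(r)
    moves: Counter = Counter()
    for calls in by_caller.values():
        for prev, curr in zip(calls, calls[1:]):
            if prev.tower_id != curr.tower_id:
                moves[(prev.tower_id, curr.tower_id)] += 1
    return moves
```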
But perhaps the most exciting use of CDRs is in the field of epidemiology. Until recently the standard way to model the spread of a disease relied on extrapolating trends from census data and surveys. CDRs, by contrast, are empirical, immediate and updated in real time. You do not have to guess where people will move or flee to. Researchers have used them to map malaria outbreaks in Kenya and Namibia and to monitor the public response to government health warnings during Mexico’s swine-flu epidemic in 2009. Models of population movements during a cholera outbreak in Haiti following the earthquake in 2010 used CDRs and provided the best estimates of where aid was most needed.
Doing the same with Ebola would be hard: in west Africa most people do not own a phone. But CDRs are nevertheless better than simulations based on stale, unreliable statistics. If researchers could track population flows from an area where an outbreak had occurred, they could see where it would be likeliest to break out next—and therefore where they should deploy their limited resources. Yet despite months of talks, and the efforts of the mobile-network operators’ trade association and several smaller UN agencies, telecoms firms have not let researchers use the data.
One excuse is privacy, which is certainly a legitimate worry, particularly in countries fresh from civil war, or where tribal tensions exist. But the phone data can be anonymised and aggregated in a way that alleviates these concerns. A bigger problem is institutional inertia. Big data is a new field. The people who grasp the benefits of examining mobile-phone usage tend to be young, and lack the clout to free the data for research use.”
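A minimal sketch of the anonymise-and-aggregate approach the piece alludes to: phone numbers are replaced with salted hashes, and only per-tower caller counts above a suppression threshold are released. The salt handling, field layout and threshold below are all assumptions for illustration, not any operator's actual pipeline.

```python
import hashlib

SALT = b"rotate-this-secret"  # hypothetical; real deployments manage keys carefully
MIN_COUNT = 10  # suppress counts small enough to risk identifying individuals

def pseudonymise(phone_number: str) -> str:
    """Replace a phone number with a salted hash so records can be linked
    across calls without exposing the underlying identity."""
    return hashlib.sha256(SALT + phone_number.encode()).hexdigest()[:16]

def aggregate_by_tower(records: list[tuple[str, str]]) -> dict[str, int]:
    """Turn (phone_number, tower_id) pairs into distinct-caller counts per
    tower, dropping towers with too few callers to share safely."""
    callers_per_tower: dict[str, set[str]] = {}
    for phone, tower in records:
        callers_per_tower.setdefault(tower, set()).add(pseudonymise(phone))
    return {t: len(c) for t, c in callers_per_tower.items() if len(c) >= MIN_COUNT}
```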
From the smart city to the wise city: The role of universities in place-based leadership
Paper by Hambleton, R.: “For a variety of reasons the notion of the smart city has grown in popularity, and some even claim that all cities now have to be ‘smart’. For example, some digital enthusiasts argue that advances in Information and Communication Technologies (ICT) are ushering in a new era in which pervasive electronic connections will inevitably lead to significant changes that make cities more liveable and more democratic. This paper will cast a critical eye over these claims. It will unpack the smart city rhetoric and show that, in fact, three competing perspectives are struggling for ascendancy within the smart cities discourse: 1) The digital city (emphasising a strong commitment to the use of ICT in governance), 2) The green city (reflecting the growing use of the US phrase ‘smart growth’, which seeks to apply sound urban-planning principles), and 3) The learning city (emphasising the way in which cities learn, network and innovate). Five digital danger zones will be identified and discussed. This analysis will suggest that scholars and policy makers who wish to improve the quality of life in cities should focus their attention on wisdom, not smartness. Civic leaders need to exercise judgement based on values if they are to create inclusive, sustainable cities. It is not enough to be clever, quick or ingenious, nor will it help if Big Data is superseded by Even Bigger Data. Universities can play a much more active role in place-based leadership in the cities where they are located. To do this effectively they need to reconsider the nature of modern scholarship. The paper will show how a growing number of universities are doing precisely this. Two respected examples will be presented to show how urban universities, if they are committed to engaged scholarship, can make a significant contribution to the creation of the wise city.”
Privacy Identity Innovation: Innovator Spotlight
pii2014: “Every year, we invite a select group of startup CEOs to present their technologies on stage at Privacy Identity Innovation as part of the Innovator Spotlight program. This year’s conference (pii2014) is taking place November 12-14 in Silicon Valley, and we’re excited to announce that the following eight companies will be participating in the pii2014 Innovator Spotlight:
* BeehiveID – Led by CEO Mary Haskett, BeehiveID is a global identity validation service that enables trust by identifying bad actors online BEFORE they have a chance to commit fraud.
* Five – Led by CEO Nikita Bier, Five is a mobile chat app crafted around the experience of a house party. With Five, you can browse thousands of rooms and have conversations about any topic.
* Glimpse – Led by CEO Elissa Shevinsky, Glimpse is a private (disappearing) photo messaging app just for groups.
* Humin – Led by CEO Ankur Jain, Humin is a phone and contacts app designed to think about people the way you naturally do by remembering the context of your relationships and letting you search them the way you think.
* Kpass – Led by CEO Dan Nelson, Kpass is an identity platform that provides brands, apps and developers with an easy-to-implement technology solution to help manage the notice and consent requirements of the Children’s Online Privacy Protection Act (COPPA).
* Meeco – Led by CEO Katryna Dow, Meeco is a Life Management Platform that offers an all-in-one solution for you to transact online, collect your own personal data, and be more anonymous with greater control over your own privacy.
* TrustLayers – Led by CEO Adam Towvim, TrustLayers is privacy intelligence for big data. TrustLayers enables confident use of personal data, keeping companies secure in the knowledge that teams across the organization are following the rules.
* Virtru – Led by CEO John Ackerly, Virtru is the first company to make email privacy accessible to everyone. With a single plug-in, Virtru empowers individuals and businesses to control who receives, reviews, and retains their digital information — wherever it travels, throughout its lifespan.
Learn more about the startups on the Innovator Spotlight page…”
Chicago uses big data to save itself from urban ills
Aviva Rutkin in the New Scientist: “THIS year in Chicago, some kids will get lead poisoning from the paint or pipes in their homes. Some restaurants will cook food in unsanitary conditions and, here and there, a street corner will be suddenly overrun with rats. These kinds of dangers are hard to avoid in a city of more than 2.5 million people. The problem is, no one knows for certain where or when they will pop up.
The Chicago city government is hoping to change that by knitting powerful predictive models into its everyday city inspections. Its latest project, currently in pilot tests, analyses factors such as home inspection records and census data, and uses the results to guess which buildings are likely to cause lead poisoning in children – a problem that affects around 500,000 children in the US each year. The idea is to identify trouble spots before kids are exposed to dangerous lead levels.
“We are able to prevent problems instead of just respond to them,” says Jay Bhatt, chief innovation officer at the Chicago Department of Public Health. “These models are just the beginning of the use of predictive analytics in public health and we are excited to be at the forefront of these efforts.”
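Chicago has not published the details of its model here, but as a generic illustration of this kind of predictive inspection tool, a classifier can be trained on past inspection outcomes and then used to rank candidate buildings by risk. The library choice, feature columns and target label below are all hypothetical, a sketch rather than the city's system.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def rank_buildings(history: pd.DataFrame, candidates: pd.DataFrame) -> pd.DataFrame:
    """Train on past inspections (hypothetical columns), then score candidate
    buildings so inspectors can visit the likeliest trouble spots first."""
    features = ["year_built", "prior_violations", "pct_pre_1978_housing", "median_income"]
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(history[features], history["lead_hazard_found"])
    ranked = candidates.copy()
    # Probability of the positive class serves as the risk score.
    ranked["risk_score"] = model.predict_proba(ranked[features])[:, 1]
    return ranked.sort_values("risk_score", ascending=False)
```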
Chicago’s projects are based on the thinking that cities already have what they need to raise their municipal IQ: piles and piles of data. In 2012, city officials built WindyGrid, a platform that collected data like historical facts about buildings and up-to-date streams such as bus locations, tweets and 911 calls. The project was designed as a proof of concept and was never released publicly but it led to another, called Plenario, that allowed the public to access the data via an online portal.
The experience of building those tools has led to more practical applications. For example, one tool matches calls to the city’s municipal hotline complaining about rats with conditions that draw rats to a particular area, such as excessive moisture from a leaking pipe, or with an increase in complaints about garbage. This allows officials to proactively deploy sanitation crews to potential hotspots. It seems to be working: last year, resident requests for rodent control dropped by 15 per cent.
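A stripped-down sketch of that matching idea: weight each area's hotline complaints by how strongly a complaint type is thought to precede rat problems, then rank areas for proactive sanitation visits. The complaint categories and weights are invented for illustration, not drawn from Chicago's actual tool.

```python
from collections import Counter

# Hypothetical weights for complaint types the article says tend to draw
# rats: excess moisture from leaks, and accumulating garbage.
LEADING_INDICATORS = {"water_leak": 2.0, "garbage_overflow": 1.5, "vacant_building": 1.0}

def hotspot_scores(complaints: list[tuple[str, str]]) -> list[tuple[str, float]]:
    """Score each area by weighted counts of complaints that precede rat
    problems, so crews can be deployed before infestations are reported."""
    scores: Counter = Counter()
    for area, complaint_type in complaints:
        scores[area] += LEADING_INDICATORS.get(complaint_type, 0.0)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```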
Some predictions are trickier to get right. Charlie Catlett, director of the Urban Center for Computation and Data in Chicago, is investigating an old axiom among city cops: that violent crime tends to spike when there’s a sudden jump in temperature. But he’s finding it difficult to test its validity in the absence of a plausible theory for why it might be the case. “For a lot of things about cities, we don’t have that underlying theory that tells us why cities work the way they do,” says Catlett.
Still, predictive modelling is maturing, as other cities succeed in using it to tackle urban ills….Such efforts can be a boon for cities, making them more productive, efficient and safe, says Rob Kitchin of Maynooth University in Ireland, who helped launch a real-time data site for Dublin last month called the Dublin Dashboard. But he cautions that there’s a limit to how far these systems can aid us. Knowing that a particular street corner is likely to be overrun with rats tomorrow doesn’t address what caused the infestation in the first place. “You might be able to create a sticking plaster or be able to manage it more efficiently, but you’re not going to be able to solve the deep structural problems….”
Training Students to Extract Value from Big Data
The nation’s ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable of exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program.
Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council’s Committee on Applied and Theoretical Statistics to explore how best to train students to use big data. The workshop explored the need for such training and the curricula and coursework it should include. One impetus for the workshop was the current fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and they have their own notions of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula…”
Ebola: Can big data analytics help contain its spread?
BBC News: “While emergency response teams, medical charities and non-governmental organisations struggle to contain the virus, could big data analytics help?
A growing number of data scientists believe so….
Mobile phones, widely owned in even the poorest countries in Africa, are proving to be a rich source of data in a region where other reliable sources are sorely lacking.
Orange Telecom in Senegal handed over anonymised voice and text data from 150,000 mobile phones to Flowminder, a Swedish non-profit organisation, which was then able to draw up detailed maps of typical population movements in the region.
Authorities could then see where the best places were to set up treatment centres, and more controversially, the most effective ways to restrict travel in an attempt to contain the disease.
The drawback with this data was that it was historical, whereas authorities really need to map movements in real time: people’s movements tend to change during an epidemic.
This is why the US Centers for Disease Control and Prevention (CDC) is also collecting mobile phone mast activity data from mobile operators and mapping where calls to helplines are mostly coming from.

A sharp increase in calls to a helpline from one particular area would suggest an outbreak and alert authorities to direct more resources there.
Mapping software company Esri is helping the CDC to visualise this data and overlay other existing sources of data, such as censuses, to build up a richer picture.
The level of activity at each mobile phone mast also gives a kind of heatmap of where people are and crucially, where and how far they are moving.
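A simple way to flag the kind of spike described above is to compare each area's call volume today against that area's recent baseline. The data shape (daily counts per area) and the z-score threshold below are assumptions for illustration, not the CDC's actual method.

```python
import statistics

def spike_areas(daily_calls: dict[str, list[int]], today: dict[str, int],
                z_threshold: float = 3.0) -> list[str]:
    """Flag areas whose helpline call volume today sits far above that
    area's recent baseline (mean plus z_threshold standard deviations)."""
    flagged = []
    for area, history in daily_calls.items():
        if len(history) < 7:
            continue  # not enough baseline to judge
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1.0  # avoid division by zero
        if (today.get(area, 0) - mean) / stdev > z_threshold:
            flagged.append(area)
    return flagged
```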
“We’ve never had this large-scale, anonymised mobile phone data before as a species,” says Nuria Oliver, a scientific director at mobile phone company Telefonica.
“The most positive impact we can have is to help emergency relief organisations and governments anticipate how a disease is likely to spread.
“Until now they had to rely on anecdotal information, on-the-ground surveys, police and hospital reports.”…
Big Thinkers. Big Data. Big Opportunity: Announcing The LinkedIn Economic Graph Challenge
The LinkedIn Economic Graph Challenge is an idea that emerged from the development of the Economic Graph, a digital mapping of the global economy comprising a profile for every professional, every company, every job opportunity, the skills required to obtain those opportunities, every higher education organization, and all the professionally relevant knowledge associated with each of these entities. With these elements in place, we can connect talent with opportunity at massive scale.
We are launching the LinkedIn Economic Graph Challenge to encourage researchers, academics, and data-driven thinkers to propose how they would use data from LinkedIn to solve some of the most challenging economic problems of our times. We invite anyone who is interested to submit their most innovative, ambitious ideas. In return, we will recognize the three strongest proposals for using data from LinkedIn to generate a positive impact on the global economy, and present the team and/or individual with a $25,000 (USD) research award and the resources to complete their proposed research, with the potential to have it published….
We look forward to your submissions! For more information, please visit the LinkedIn Economic Graph Challenge website….”
New Technology and the Prevention of Violence and Conflict
Report edited by Francesco Mancini for the International Peace Institute: “In an era of unprecedented interconnectivity, this report explores the ways in which new technologies can assist international actors, governments, and civil society organizations to more effectively prevent violence and conflict. It examines the contributions that cell phones, social media, crowdsourcing, crisis mapping, blogging, and big data analytics can make to short-term efforts to forestall crises and to long-term initiatives to address the root causes of violence.
Five case studies assess the use of such tools in a variety of regions (Africa, Asia, Latin America) experiencing different types of violence (criminal violence, election-related violence, armed conflict, short-term crisis) in different political contexts (restrictive and collaborative governments).
Drawing on lessons and insights from across the cases, the authors outline a how-to guide for leveraging new technology in conflict-prevention efforts:
1. Examine all tools.
2. Consider the context.
3. Do no harm.
4. Integrate local input.
5. Help information flow horizontally.
6. Establish consensus regarding data use.
7. Foster partnerships for better results.”
The Web Observatory: A Middle Layer for Broad Data
New paper by Thanassis Tiropanis, Wendy Hall, James Hendler, and Christian de Larrinaga in Big Data: “The Web Observatory project is a global effort that is being led by the Web Science Trust, its network of WSTnet laboratories, and the wider Web Science community. The goal of this project is to create a global distributed infrastructure that will foster communities exchanging and using each other’s web-related datasets as well as sharing analytic applications for research and business web applications. It will provide the means to observe the digital planet, explore its processes, and understand their impact on different sectors of human activity.
The project is creating a network of separate web observatories, collections of datasets and tools for analyzing data about the Web and its use, each with its own user community. This allows researchers across the world to develop and share data, analytic approaches, publications related to their datasets, and tools. The network of web observatories aims to bridge the gap that currently exists between big data analytics and the rapidly growing web of “broad data,” a gap that makes it difficult for a large number of people to engage with these datasets….”
The View From Your Window Is Worth Cash to This Company
Eric Jaffe in Atlantic CityLab: “A city window overlooking the street has always been a score in its own right, what with so many apartments stuck opening onto back alleys and dumpsters and fire escapes. And now, a company wants to straight up monetize the view. New York startup Placemeter is paying city residents up to $50 a month for street views captured via old smartphones. The idea is to quantify sidewalk life in the service of making the city a more efficient place.
“Measuring data about how the city moves in real time, being able to make predictions on that, is definitely a good way to help cities work better,” says founder Alex Winter. “That’s the vision of Placemeter—to build a data platform where anyone at any time can know how busy the city is, and use that.”
Here’s how it works: City residents send Placemeter a little information about where they live and what they see from their window. In turn, Placemeter sends participants a kit (complete with window suction cup) to convert their unused smartphone into a street sensor, and agrees to pay cash so long as the device stays on and collects data. The more action outside—the more shops, pedestrians, traffic, and public space—the more the view is worth.
On the back end, Placemeter converts the smartphone images into statistical data using proprietary computer vision. The company first detects moving objects (the green splotches in the video below) and classifies each one as a person or as one of 11 types of vehicles and other common urban elements, such as food carts. A second layer of analysis connects this movement with behavioral patterns based on the location—how many cars are speeding down a street, for instance, or how many people are going into a store….
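Placemeter's pipeline is proprietary, so as a generic illustration of the first step it describes, the sketch below uses OpenCV background subtraction to count moving blobs per frame. The function, blob-size threshold and parameters are assumptions, not the company's method.

```python
import cv2

def count_moving_objects(video_path: str, min_area: int = 500) -> list[int]:
    """Return a per-frame count of moving blobs larger than min_area pixels,
    using background subtraction to separate motion from the static scene."""
    capture = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    counts = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                      # foreground mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        counts.append(sum(1 for c in contours if cv2.contourArea(c) >= min_area))
    capture.release()
    return counts
```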
Efforts to quantify city life with big data aren’t new, but Placemeter’s clear advance is its ability to count pedestrians. Cities often track sidewalk traffic with little more than a hired hand and a manual clicker at a few spot locations. With its army of smartphone eyes, Placemeter promises a much wider net of real-time data, dynamic enough to recognize not only that a person exists but also that person’s behavior, from walking speed to retail interest to general interaction with streets or public spaces…