Mobile phone data are a treasure-trove for development


Paul van der Boor and Amy Wesolowski in SciDevNet: “Each of us generates streams of digital information — a digital ‘exhaust trail’ that provides real-time information to guide decisions that affect our lives. For example, Google informs us about traffic by using both its ‘My Location’ feature on mobile phones and third-party databases to aggregate location data. BBVA, one of Spain’s largest banks, analyses transactions such as credit card payments as well as ATM withdrawals to find out when and where peak spending occurs.
This type of data harvest is of great value. But, often, there is so much data that its owners lack the know-how to process it and fail to realise its potential value to policymakers.
Meanwhile, many countries, particularly in the developing world, have a dearth of information. In resource-poor nations, the public sector often lives in an analogue world where piles of paper impede operations and policymakers are hindered by uncertainty about their own strengths and capabilities.
Nonetheless, mobile phones have quickly pervaded the lives of even the poorest: 75 per cent of the world’s 5.5 billion mobile subscriptions are in emerging markets. These people are also generating digital trails of anything from their movements to mobile phone top-up patterns. It may seem that putting this information to use would take vast analytical capacity. But using relatively simple methods, researchers can analyse existing mobile phone data, especially in poor countries, to improve decision-making.
Think of existing, available data as low-hanging fruit that we — two graduate students — could analyse in less than a month. This is not a test of data-scientist prowess, but more a way of saying that anyone could do it.
There are three areas that should be ‘low-hanging fruit’ in terms of their potential to dramatically improve decision-making in information-poor countries: coupling healthcare data with mobile phone data to predict disease outbreaks; using mobile phone money transactions and top-up data to assess economic growth; and predicting travel patterns after a natural disaster using historical movement patterns from mobile phone data to design robust response programmes.
Another possibility is using call-data records to analyse urban movement to identify traffic congestion points. Nationally, this can be used to prioritise infrastructure projects such as road expansion and bridge building.
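As a sense of how little machinery such an analysis needs, here is a minimal sketch of the corridor-counting idea: tally observed moves between cell towers in a simplified call-data record (CDR) table and rank the busiest pairs. The file format and column names are hypothetical, and this is an editorial illustration of the general approach, not a method described by the authors.

```python
# Toy sketch: rank heavily travelled tower-to-tower corridors from simplified CDRs.
# Assumes a CSV with hypothetical columns: subscriber_id, timestamp (ISO 8601), cell_tower_id.
from collections import Counter
import csv

def busiest_corridors(cdr_path, top_n=10):
    last_tower = {}               # most recent tower observed for each subscriber
    corridor_counts = Counter()   # (from_tower, to_tower) -> number of observed moves
    with open(cdr_path, newline="") as f:
        rows = sorted(csv.DictReader(f), key=lambda r: r["timestamp"])  # ISO timestamps sort lexicographically
    for row in rows:
        sub, tower = row["subscriber_id"], row["cell_tower_id"]
        prev = last_tower.get(sub)
        if prev is not None and prev != tower:
            corridor_counts[(prev, tower)] += 1
        last_tower[sub] = tower
    return corridor_counts.most_common(top_n)   # candidate congestion corridors

# e.g. busiest_corridors("cdr_sample.csv") might return [(("tower_12", "tower_7"), 5314), ...]
```

A real analysis would normalise by subscriber density and time of day, but even a crude tally like this hints at where road capacity is under the most pressure.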
The information that these analyses could provide would be lifesaving — not just informative or revenue-increasing, like much of this work currently performed in developed countries.
But some work of high social value is being done. For example, different teams of European and US researchers are trying to estimate the links between mobile phone use and regional economic development. They are using various techniques, such as merging night-time satellite imagery from NASA with mobile phone data to create behavioural fingerprints. They have found that this may be a cost-effective way to understand a country’s economic activity and, potentially, guide government spending.
Another example is given by researchers (including one of this article’s authors) who have analysed call-data records from subscribers in Kenya to understand malaria transmission within the country and design better strategies for its elimination. [1]
In this study, published in Science, the location data of the mobile phones of more than 14 million Kenyan subscribers was combined with national malaria prevalence data. After identifying the sources and sinks of malaria parasites and overlaying these with phone movements, analysis was used to identify likely transmission corridors. UK scientists later used similar methods to create different epidemic scenarios for Côte d’Ivoire.”
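To make the source-and-sink logic concrete, the fragment below is a deliberately simplified sketch: weight observed travel between regions by malaria prevalence at the origin to rank likely parasite import routes. All names and numbers are invented for illustration, and the published study used far richer epidemiological models than this.

```python
# Simplified illustration of the source/sink idea: travel out of high-prevalence regions
# ("sources") into lower-prevalence regions ("sinks") suggests likely parasite import routes.
# Inputs are hypothetical toy data, not results from the Science study.

def rank_import_routes(trips, prevalence, top_n=5):
    """trips: {(origin, destination): observed trip count}
    prevalence: {region: estimated malaria prevalence, 0..1}"""
    scores = {
        (o, d): n * prevalence.get(o, 0.0)   # more travel from a high-prevalence origin -> higher score
        for (o, d), n in trips.items()
        if o != d
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Made-up example:
trips = {("Lake Victoria", "Nairobi"): 12000, ("Coast", "Nairobi"): 8000}
prevalence = {"Lake Victoria": 0.38, "Coast": 0.08}
print(rank_import_routes(trips, prevalence))
# -> [(('Lake Victoria', 'Nairobi'), 4560.0), (('Coast', 'Nairobi'), 640.0)]
```

Even at this level of simplification, the ranking shows why overlaying movement data on prevalence maps can point control programmes toward the corridors that matter most.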

Explore the world’s constitutions with a new online tool


Official Google Blog: “Constitutions are as unique as the people they govern, and have been around in one form or another for millennia. But did you know that every year approximately five new constitutions are written, and 20-30 are amended or revised? Or that Africa has the youngest set of constitutions, with 19 of the 39 constitutions written globally since 2000 coming from the region?
The process of redesigning and drafting a new constitution can play a critical role in uniting a country, especially following periods of conflict and instability. In the past, it’s been difficult to access and compare existing constitutional documents and language—which is critical to drafters—because the texts are locked up in libraries or on the hard drives of constitutional experts. Although the process of drafting constitutions has evolved from chisels and stone tablets to pens and modern computers, there has been little innovation in how their content is sourced and referenced.
With this in mind, Google Ideas supported the Comparative Constitutions Project to build Constitute, a new site that digitizes and makes searchable the world’s constitutions. Constitute enables people to browse and search constitutions via curated and tagged topics, as well as by country and year. The Comparative Constitutions Project cataloged and tagged nearly 350 themes, so people can easily find and compare specific constitutional material. This ranges from the fairly general, such as “Citizenship” and “Foreign Policy,” to the very specific, such as “Suffrage and turnouts” and “Judicial Autonomy and Power.”
Our aim is to arm drafters with a better tool for constitution design and writing. We also hope citizens will use Constitute to learn more about their own constitutions, and those of countries around the world.”

Civics for a Digital Age


Jathan Sadowski in the Atlantic on “Eleven principles for relating to cities that are automated and smart”: “Over half of the world’s population lives in urban environments, and that number is rapidly growing according to the World Health Organization. Many of us interact with the physical environments of cities on a daily basis: the arteries that move traffic, the grids that energize our lives, the buildings that prevent and direct actions. For many tech companies, though, much of this urban infrastructure is ripe for a digital injection. Cities have been “dumb” for millennia. It’s about time they get “smart” — or so the story goes….
Before accepting the techno-hype as a fait accompli, we should consider the implications such widespread technological changes might have on society, politics, and life in general. Urban scholar and historian Lewis Mumford warned of “megamachines” where people become mere components — like gears and transistors — in a hierarchical, human machine. The proliferation of smart projects requires an updated way of thinking about their possibilities, complications, and effects.
A new book, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, by Anthony Townsend, a research director at the Institute for the Future, provides some groundwork for understanding how these urban projects are occurring and what guiding principles we might use in directing their development. Townsend sets out to sketch a new understanding of “civics,” one that will account for new technologies.
The foundation for his theory speaks to common, worthwhile concerns: “Until now, smart-city visions have been controlling us. What we need is a new social code to bring meaning and to exert control over the technological code of urban operating systems.” It’s easy to feel like technologies — especially urban ones that are, at once, ubiquitous and often unseen to city-dwellers — have undue influence over our lives. Townsend’s civics, which is based on eleven principles, looks to address, prevent, and reverse that techno-power.”

Riding the Waves or Caught in the Tide? Navigating the Evolving Information Environment


IFLA Trend Report: “In the global information environment, time moves quickly and there’s an abundance of commentators trying to keep up. With each new technological development, a new report emerges assessing its impact on different sectors of society. The IFLA Trend Report takes a broader approach and identifies five high-level trends shaping the information society, spanning access to information, education, privacy, civic engagement and transformation. Its findings reflect a year’s consultation with a range of experts and stakeholders from different disciplines to map broader societal changes occurring, or likely to occur, in the information environment.
The IFLA Trend Report is more than a single document – it is a selection of resources to help you understand where libraries fit into a changing society.
From Five Key Trends Which Will Change Our Information Environment:
Trend 1:
New Technologies Will Both Expand and Limit Who Has Access to Information…
Trend 2:
Online Education Will Democratise and Disrupt Global Learning…
Trend 3:
The Boundaries of Privacy and Data Protection Will Be Redefined…
Trend 4:
Hyper-Connected Societies Will Listen to and Empower New Voices and Groups… In hyper-connected societies more opportunities for collective action are being realised – enabling the rise of new voices and promoting the growth of single-issue movements at the expense of traditional political parties. Open government initiatives and access to public sector data are leading to more transparency and citizen-focused public services.
Trend 5:
The Global Information Economy Will Be Transformed by New Technologies…”

From Networked Publics to Issue Publics: Reconsidering the Public/Private Distinction in Web Science


New paper by Andreas Birkbak: “As an increasing part of everyday life becomes connected with the web in many areas of the globe, the question of how the web mediates political processes becomes still more urgent. Several scholars have started to address this question by thinking about the web in terms of a public space. In this paper, we aim to make a twofold contribution towards the development of the concept of publics in web science. First, we propose that although the notion of publics raises a variety of issues, two major concerns continue to be user privacy and democratic citizenship on the web. Well-known arguments hold that the complex connectivity of the web puts user privacy at risk and enables the enclosure of public debate in virtual echo chambers. Our first argument is that these concerns are united by a set of assumptions coming from liberal political philosophy that are rarely made explicit. As a second contribution, this paper points towards an alternative way to think about publics by proposing a pragmatist reorientation of the public/private distinction in web science, away from seeing two spheres that need to be kept separate, towards seeing the public and the private as something that is continuously connected. The theoretical argument is illustrated by reference to a recently published case study of Facebook groups, and future research agendas for the study of web-mediated publics are proposed.”

Confronting Wicked Problems in the Metropolis


An APSA 2013 Annual Meeting Paper by Jered Carr and Brent Never: “The problems facing many metropolitan regions in the U.S. are complex, open-ended and seemingly intractable. The obstacles to regional governance created by these “wicked” problems are the root of the criticisms of the consensus-based “self-organizing” strategies described by frameworks such as New Regionalism and Institutional Collective Action. The self-organized solutions described by these frameworks require that substantial consensus exist among the participants, and this creates a bias toward solving low-conflict problems where consensus already exists. We discuss the limitations of these two influential research programs in the context of wicked problems and draw on the concept of nested institutional action situations to suggest a research agenda for studying intergovernmental collaboration on problems requiring the development of consensus about the nature of the problem and acceptable solutions. The Advocacy Coalitions and Institutional Analysis and Development frameworks have been effectively used to explain regional collaboration on wicked environmental problems and likely have insights for confronting the wicked fiscal and social problems of regional governance. The implications are that wicked problems are tamed through iterated games and that institution-making at the collective-choice level can then be scaled up to achieve agreement at the constitutional level of analysis.”

Project Anticipation


New site for the UNESCO Chair in Anticipatory Systems: “The purpose of the Chair in Anticipatory Systems is to both develop and promote the Discipline of Anticipation, thereby bringing a critical idea to life. To this end, we have a two-pronged strategy consisting of knowledge development and communication. The two are equally important. While many academic projects naturally emphasize knowledge development, we must also reach a large and disparate audience, and open minds locked within the longstanding legacy of reactive science. Thus, from a practical standpoint, how we conceptualize and communicate the Discipline of Anticipation is as important as the Discipline of Anticipation itself….
The project’s main objective is the development of the Discipline of Anticipation, including the development of a system of anticipatory strategies and techniques. The more the culture of anticipation spreads, the easier it will be to develop socially acceptable anticipatory strategies. It will then be possible to accumulate relevant experience on how to think about the future and to use anticipatory methods. It will also be possible to try to develop a language and a body of practices that are more adapted for thinking about the future and for developing new ways to address threats and opportunities.
The following outcomes are envisaged:

  • Futures Literacy: Development of a set of protocols for the appropriate implementation on the ground of the different kinds of anticipation (under the rubric of futures literacy), together with syllabi and teaching materials on the Discipline of Anticipation.
  • Anticipatory Capability Profile: Development of an Anticipatory Capability Profile for communities and institutions, together with a set of recommendations on how a community, organization or institution may raise its anticipatory performance.
  • Resilience Profile: Setting of a resilience index and analysis of the resilience level of selected communities and regions, including a set of recommendations on how to raise their resilience level.”

New! Humanitarian Computing Library


Patrick Meier at iRevolution: “The field of “Humanitarian Computing” applies Human Computing and Machine Computing to address major information-based challenges in the humanitarian space. Human Computing refers to crowdsourcing and microtasking, which is also referred to as crowd computing. In contrast, Machine Computing draws on natural language processing and machine learning, amongst other disciplines. The Next Generation Humanitarian Technologies we are prototyping at QCRI are powered by Humanitarian Computing research and development (R&D).
My QCRI colleagues and I just launched the first ever Humanitarian Computing Library, which is publicly available here. The purpose of this library, or wiki, is to consolidate existing and future research that relates to Humanitarian Computing in order to support the development of next generation humanitarian tech. The repository currently holds over 500 publications that span topics such as Crisis Management, Trust and Security, Software and Tools, Geographical Analysis and Crowdsourcing. These publications are largely drawn from (but not limited to) peer-reviewed papers submitted at leading conferences around the world. We invite you to add your own research on humanitarian computing to this growing collection of resources.”

Linux Foundation Collaboration Gets Biological


eWeek: “The Linux Foundation is growing its roster of collaboration projects by expanding from the physical into the biological realm with the OpenBEL (Biological Expression Language). The Linux Foundation, best known as the organization that helps bring Linux vendors and developers together, is also growing its expertise as a facilitator for collaborative development projects…
OpenBEL got its start in June 2012 after being open-sourced by biotech firm Selventa. The effort now includes the participation of Foundation Medicine, AstraZeneca, The Fraunhofer Institute, Harvard Medical School, Novartis, Pfizer and the University of California at San Diego.
BEL offers researchers a language to clearly express scientific findings from the life sciences in a format that can be understood by computing infrastructure…
The Linux Foundation currently hosts a number of different collaboration projects, including the Xen virtualization project, the OpenDaylight software-defined networking effort, Tizen for mobile phone development, and OpenMAMA for financial services information, among others.
The OpenBEL project will be similar to existing collaboration projects in that the contributors to the project want to accelerate their work through collaborative development, McPherson explained.”

Government Is a Good Venture Capitalist


Wall Street Journal: “In a knowledge-intensive economy, innovation drives growth. But what drives innovation? In the U.S., most conservatives believe that economically significant new ideas originate in the private sector, through either the research-and-development investments of large firms with deep pockets or the inspiration of obsessive inventors haunting shabby garages. In this view, the role of government is to secure the basic conditions for honest and efficient commerce—and then get out of the way. Anything more is bound to be “wasteful” and “burdensome.”
The real story is more complex and surprising. For more than four decades, R&D magazine has recognized the top innovations—100 each year—that have moved past the conceptual stage into commercial production and sales. Economic sociologists Fred Block and Matthew Keller decided to ask a simple question: Where did these award-winning innovations come from?
The data indicated seven kinds of originating entities: Fortune 500 companies; small and medium enterprises (including startups); collaborations among private entities; government laboratories; universities; spinoffs started by researchers at government labs or universities; and a grab bag of other public and nonprofit agencies.
Messrs. Block and Keller randomly selected three years in each of the past four decades and analyzed the resulting 1,200 innovations. About 10% originated in foreign entities; the sociologists focused on the domestic innovations, more than 1,050.
Two of their findings stand out. First, the number of award winners originating in Fortune 500 companies—either working alone or in collaboration with others—has declined steadily and sharply, from an annual average of 44 in the 1970s to only nine in the first decade of this century.
Second, the number of top innovations originating in federal laboratories, universities or firms formed by former researchers in those entities rose dramatically, from 18 in the 1970s to 37 in the 1980s and 55 in the 1990s before falling slightly to 49 in the 2000s. Without the research conducted in federal labs and universities (much of it federally funded), commercial innovation would have been far less robust…”