New book: Disasters and the Networked Economy


Book description: “Mainstream quantitative analysis and simulations are fraught with difficulties and are intrinsically unable to deal appropriately with long-term macroeconomic effects of disasters. In this new book, J.M. Albala-Bertrand develops the themes introduced in his past book, The Political Economy of Large Natural Disasters (Clarendon Press, 1993), to show that societal networking and disaster localization constitute part of an essential framework to understand disaster effects and responses.
The author’s last book argued that disasters were a problem of development, rather than a problem for development. This volume takes the argument forward both in terms of the macroeconomic effects of disaster and development policy, arguing that economy and society are not inert objects, but living organisms. Using a framework based on societal networking and the economic localization of disasters, the author shows that societal functionality (defined as the capacity of a system to survive, reproduce and develop) is unlikely to be impaired by natural disasters.”

Data-Driven Public Transport Planning


David Talbot in MIT Technology Review: “Researchers at IBM, using movement data collected from millions of cell-phone users in Ivory Coast in West Africa, have developed a new model for optimizing an urban transportation system….
While the results were preliminary, they point to the new ways that urban planners can use cell-phone data to design infrastructure, says Francesco Calabrese, a researcher at IBM’s research lab in Dublin, and a coauthor of a paper on the work. “This represents a new front with a potentially large impact on improving urban transportation systems,” he says. “People with cell phones can serve as sensors and be the building blocks of development efforts.”
The IBM work was done as part of a research challenge dubbed Data for Development, in which the telecom giant Orange released 2.5 billion call records from five million cell-phone users in Ivory Coast. The records were gathered between December 2011 and April 2012, and the release is the largest of its kind to date. The records were cleaned to prevent anyone from identifying individual users, but they still include useful information about those users’ movements. The IBM paper is one of scores being presented later this week at a conference at MIT.”
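By way of illustration only (a simplified sketch, not the method in the IBM paper), one basic input such transit models consume is an origin-destination matrix: counts of movements between cell towers, reconstructed from each user’s sequence of calls. The CSV layout and column names below are assumptions.

```python
# A minimal sketch of deriving origin-destination flows from anonymized
# call records. Column layout is an assumption: user_id, timestamp,
# tower_id, with rows sorted by timestamp.
import csv
from collections import defaultdict

def od_matrix(path):
    """Count movements between consecutive cell towers per user."""
    last_tower = {}           # last tower seen for each anonymized user
    flows = defaultdict(int)  # (origin_tower, dest_tower) -> trip count
    with open(path, newline="") as f:
        for user_id, _timestamp, tower_id in csv.reader(f):
            prev = last_tower.get(user_id)
            if prev is not None and prev != tower_id:
                flows[(prev, tower_id)] += 1
            last_tower[user_id] = tower_id
    return flows

if __name__ == "__main__":
    # Print the ten heaviest flows, candidates for better transit service.
    top = sorted(od_matrix("calls.csv").items(), key=lambda kv: -kv[1])[:10]
    for (origin, dest), trips in top:
        print(f"{origin} -> {dest}: {trips} trips")
```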

Cybersecurity Issues in Social Media and Crowdsourcing


Wilson Center: “The Commons Lab today released a new policy memo exploring the vulnerabilities facing the widespread use and acceptance of social media and crowdsourcing. This is the second publication in the project’s policy memo series.
Using real-world examples, security expert George Chamales describes the most-pressing cybersecurity vulnerabilities in this space and calls for the development of best practices to address these vulnerabilities, ultimately concluding that it is possible for institutions to develop trust in the emerging technologies. From the memo’s executive summary:
Individuals and organizations interested in using social media and crowdsourcing currently lack two key sets of information: a systematic assessment of the vulnerabilities in these technologies and a comprehensive set of best practices describing how to address those vulnerabilities. Identifying those vulnerabilities and developing those best practices are necessary to address a growing number of incidents ranging from innocent mistakes to targeted attacks that have claimed lives and cost millions of dollars.
The full memo is available on Scribd.

"Imagery to the Crowd"


Description: “The Humanitarian Information Unit (HIU), a division within the Office of the Geographer and Global Issues at the U.S. Department of State, is working to increase the availability of spatial data in areas experiencing humanitarian emergencies. Built from a crowdsourcing model, the new “Imagery to the Crowd” process publishes high-resolution commercial satellite imagery, purchased by the United States Government, in a web-based format that can be easily mapped by volunteers.
The digital map data generated by the volunteers are stored in a database maintained by OpenStreetMap (OSM), a UK-registered non-profit foundation, under a license that ensures the data are freely available and open for a range of uses (http://osm.org). Inspired by the success of the OSM mapping effort after the 2010 Haiti earthquake, the Imagery to the Crowd process harnesses the combined power of satellite imagery and the volunteer mapping community to help aid agencies provide informed and effective humanitarian assistance, and plan recovery and development activities.
A 5-minute Ignite Talk about Imagery to the Crowd is embedded in the original post.
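As an illustration of the “freely available and open for a range of uses” point (this is not part of the HIU workflow itself), the data volunteers produce can be retrieved programmatically. Below is a minimal sketch that queries the public Overpass API for a count of buildings mapped in a small bounding box near Port-au-Prince; the coordinates are arbitrary placeholders.

```python
# A minimal sketch of retrieving volunteer-mapped OpenStreetMap data
# through the public Overpass API. The bounding box is a placeholder.
import json
import urllib.parse
import urllib.request

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Overpass QL: count all ways tagged as buildings inside
# (south, west, north, east) coordinates.
query = '[out:json];way["building"](18.53,-72.35,18.55,-72.33);out count;'

req = urllib.request.Request(
    OVERPASS_URL,
    data=urllib.parse.urlencode({"data": query}).encode(),
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(json.dumps(result, indent=2))
```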

Toward an Ecological Model of Research and Development


Ben Shneiderman, the founding director of the Human-Computer Interaction Lab, in The Atlantic: “The choice between basic and applied research is a false one….The belief that basic or pure research lays the foundation for applied research was fixed in science policy circles by Vannevar Bush’s 1945 report Science: The Endless Frontier. Unfortunately, his unsubstantiated beliefs have remained attractive to powerful advocates of basic research who seek funding for projects that may or may not advance innovation and economic growth. Shifting the policy agenda to recognize that applied research goals often trigger more effective basic research could accelerate both applied and basic research….the highest payoffs often come when there is a healthy interaction of basic and applied research (Figure 3). This ecological model also suggests that basic and applied research are embedded in a rich context of large development projects and continuing efforts to refine production & operations.”
[Figure: Shneiderman’s ecological model of research and development]

Open Data Research Announced


WWW Foundation Press Release: “Speaking at an Open Government Partnership reception last night in London, Sir Tim Berners-Lee, founder of the World Wide Web Foundation (Web Foundation) and inventor of the Web, unveiled details of the first ever in-depth study into how the power of open data could be harnessed to tackle social challenges in the developing world. The 14-country study is funded by Canada’s International Development Research Centre (IDRC) and will be overseen by the Web Foundation’s world-leading open data experts. An interim progress update will be made at an October 2013 meeting of the Open Government Partnership, with in-depth results expected in 2014…

Sir Tim Berners-Lee, founder of the World Wide Web Foundation and inventor of the Web said:

“Open Data, accessed via a free and open Web, has the potential to create a better world. However, best practice in London or New York is not necessarily best practice in Lima or Nairobi.  The Web Foundation’s research will help to ensure that Open Data initiatives in the developing world will unlock real improvements in citizens’ day-to-day lives.”

José M. Alonso, program manager at the World Wide Web Foundation, added:

“Through this study, the Web Foundation hopes not only to contribute to global understanding of open data, but also to cultivate the ability of developing world researchers and development workers to understand and apply open data for themselves.”

Further details on the project, including case study outlines, are available here: http://oddc.opendataresearch.org/

The Dangers of Surveillance


Paper by Neil M. Richards in Harvard Law Review. Abstract:  “From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, our culture is full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad, and why we should be wary of it. To the extent the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context, and why it matters. Developments in government and corporate practices have made this problem more urgent. Although we have laws that protect us against government surveillance, secret government programs cannot be challenged until they are discovered.
… I propose a set of four principles that should guide the future development of surveillance law, allowing for a more appropriate balance between the costs and benefits of government surveillance. First, we must recognize that surveillance transcends the public-private divide. Even if we are ultimately more concerned with government surveillance, any solution must grapple with the complex relationships between government and corporate watchers. Second, we must recognize that secret surveillance is illegitimate, and prohibit the creation of any domestic surveillance programs whose existence is secret. Third, we should recognize that total surveillance is illegitimate and reject the idea that it is acceptable for the government to record all Internet activity without authorization. Fourth, we must recognize that surveillance is harmful. Surveillance menaces intellectual privacy and increases the risk of blackmail, coercion, and discrimination; accordingly, we must recognize surveillance as a harm in constitutional standing doctrine.”

Taking Open Government to the Next Level


Carl Fillichio, who heads the Labor Department’s Office of Public Affairs, in Work in Progress: “Since we published a department-wide API two years ago, developers across the country have used it to create apps that educate users about workplace safety and health, employers’ compliance with wage and hour laws, and improving employment opportunities for disabled workers, just to name a few!
Releasing data through an API was a big step forward, but it was not exactly groundbreaking.  However, since then, my team has been working hard to develop software development kits that are truly innovative because they make using our API even easier.
These kits (also known as SDKs) contain application code for six different platforms (iOS, Android, BlackBerry, .NET, PHP and Ruby) that anyone creating a mobile or Web-based app using our data could incorporate. By using the kits, experienced developers will save time and novice developers will be able to work with DOL data in just a few minutes…. All of these kits can be downloaded from our developer site. Additionally, in keeping with the federal digital government strategy, each has been published as an open source project on GitHub, a popular code-sharing site. For a list of federal APIs that are supported by our kits, check the GitHub repository’s wiki page. This list will be updated as the kits are tested with additional federal APIs.”
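The SDKs wrap straightforward HTTP calls. As a rough illustration of what they abstract away, here is a minimal sketch of querying a REST API of this kind directly in Python; the base URL, dataset and table names, and the `KEY` parameter are hypothetical placeholders, not the DOL API’s actual interface, so consult the developer site for the real details.

```python
# A minimal sketch of calling a government REST API without an SDK.
# Endpoint, dataset names, and auth parameter are placeholders.
import json
import urllib.request

BASE_URL = "https://api.example.gov/v1"  # placeholder, not the real host
API_KEY = "YOUR_API_KEY"                 # registration is usually required

def fetch(dataset, table, limit=10):
    """Fetch the first `limit` records of a dataset table as parsed JSON."""
    url = f"{BASE_URL}/{dataset}/{table}?KEY={API_KEY}&limit={limit}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    records = fetch("SafetyHealth", "Inspections")  # hypothetical names
    print(json.dumps(records, indent=2)[:500])
```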

Policy Modeling through Collaboration and Simulation


New paper on “Bridging narrative scenario texts and formal policy modeling through conceptual policy modeling” in Artificial Intelligence and Law.

Abstract: “Engaging stakeholders in policy making and supporting policy development with advanced information and communication technologies including policy simulation is currently high on the agenda of research. In order to involve stakeholders in providing their input to policy modeling via online means, simple techniques need to be employed such as scenario technique. Scenarios enable stakeholders to express their views in narrative text. At the other end of policy development, a frequently used approach to policy modeling is agent-based simulation. So far, effective support to transform narrative text input to formal simulation statements is not widely available. In this paper, we present a novel approach to support the transformation of narrative texts via conceptual modeling into formal simulation models. The approach also stores provenance information which is conveyed via annotations of texts to the conceptual model and further on to the simulation model. This way, traceability of information is provided, which contributes to better understanding and transparency, and therewith enables stakeholders and policy modelers to return to the sources that informed the conceptual and simulation model.”
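As a rough sketch of the traceability idea (our illustration in Python, not the paper’s implementation), each formal simulation statement can carry links back through the conceptual model to the annotated text spans that informed it:

```python
# A minimal sketch of provenance links from simulation rules back to
# narrative scenario text. All class and field names are our invention.
from dataclasses import dataclass

@dataclass
class Annotation:
    """A span of narrative scenario text marked up by a stakeholder."""
    scenario_id: str
    start: int
    end: int
    text: str

@dataclass
class Concept:
    """An element of the conceptual policy model, traceable to annotations."""
    name: str
    sources: list  # list[Annotation]

@dataclass
class SimulationRule:
    """A formal statement for the agent-based model, traceable to concepts."""
    statement: str
    derived_from: list  # list[Concept]

    def provenance(self):
        """Walk back to the narrative text spans that informed this rule."""
        return [ann.text for c in self.derived_from for ann in c.sources]

# Example: a stakeholder sentence becomes a concept, then a formal rule.
ann = Annotation("scenario-1", 120, 178,
                 "Households cut energy use when prices rise sharply.")
concept = Concept("price-elastic demand", [ann])
rule = SimulationRule("if price_change > 0.2: agent.demand *= 0.9", [concept])
print(rule.provenance())
```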

New OECD paper on Machine-to-Machine Communications


Machine-to-Machine Communications – Connecting Billions of Devices: “This document examines the future of machine-to-machine communication (M2M), with a particular focus on mobile wireless networks. M2M devices are defined, in this paper, as those that are actively communicating using wired and wireless networks, are not computers in the traditional sense and are using the Internet in some form or another. While, at the global level, there are currently around five billion devices connected to mobile networks, this may by some estimates increase to 50 billion by the end of the decade. The report provides examples of some of the uses to which M2M is being put today and its potential to enhance economic and social development. It concludes that to achieve these benefits, however, changes to telecommunication policy and regulatory frameworks may be required. Some of the main areas that will need to be evaluated, and implications of M2M assessed, include: opening access to mobile wholesale markets for firms not providing public telecommunication services; numbering policy; frequency policy; privacy and security; and access to public sector information.”
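The report treats M2M at the policy level, but for concreteness, here is a minimal sketch of one common M2M pattern of the kind it describes: a metering device publishing readings over a network using MQTT. This is our illustration, not anything prescribed by the OECD paper; the broker host, topic, and payload format are placeholders, and the paho-mqtt package (1.x client API) is assumed.

```python
# A minimal sketch of an M2M device publishing telemetry over MQTT.
# Broker, topic, and payload are placeholders; requires paho-mqtt 1.x.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.example.org"         # placeholder M2M broker host
TOPIC = "devices/meter-42/telemetry"  # placeholder topic

client = mqtt.Client(client_id="meter-42")
client.connect(BROKER, 1883)
client.loop_start()  # handle network traffic in a background thread

for reading in (21.4, 21.6, 21.5):    # stand-in sensor values
    payload = json.dumps({"ts": time.time(), "temp_c": reading})
    client.publish(TOPIC, payload, qos=1)  # at-least-once delivery
    time.sleep(1)

client.loop_stop()
client.disconnect()
```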