Waze and the Traffic Panopticon


In the New Yorker: “In April, during his second annual State of the City address, Los Angeles Mayor Eric Garcetti announced a data-sharing agreement with Waze, the Google-owned, Israel-based navigation service. Waze is different from most navigation apps, including Google Maps, in that it relies heavily on real-time, user-generated data. Some of this data is produced actively—a driver or passenger sees a stalled vehicle, then uses a voice command or taps a stalled-vehicle icon on the app to alert others—while other data, such as the user’s location and average speed, is gathered passively, via smartphones. The agreement will see the city provide Waze with some of the active data it collects, alerting drivers to road closures, construction, and parades, among other things. From Waze, the city will get real-time data on traffic and road conditions. Garcetti said that the partnership would mean “less congestion, better routing, and a more livable L.A.” Di-Ann Eisnor, Waze’s head of growth, acknowledged to me that these kinds of deals can cause discomfort to the people working inside city government. “It’s exciting, but people inside are also fearful because it seems like too much work, or it seems so unknown,” she said.

Indeed, the deal promises to help the city improve some of its traffic and infrastructure systems (L.A. still uses paper to manage pothole patching, for example), but it also acknowledges Waze’s role in the complex new reality of urban traffic planning. Traditionally, traffic management has been a largely top-down process. In Los Angeles, it is coördinated in a bunker downtown, several stories below the sidewalk, where engineers stare at blinking lights representing traffic and live camera feeds of street intersections. L.A.’s sensor-and-algorithm-driven Automated Traffic Surveillance and Control System is already one of the world’s most sophisticated traffic-mitigation tools, but it can only do so much to manage the city’s eternally unsophisticated gridlock. Los Angeles appears to see its partnership with Waze as an important step toward further bridging the gap between its subterranean panopticon and the rest of the city, much like other metropolises that have struck deals with Waze under the company’s Connected Cities program.
Among the early adopters is Rio de Janeiro, whose urban command center tracks everything from accidents to hyperlocal weather conditions, pulling data from thirty departments and private companies, including Waze. “In Rio,” Eisnor said, traffic managers “were able to change the garbage routes, figure out where to install cameras, and deploy traffic personnel” because of the program. She also pointed out that Connected Cities has helped municipal workers in Washington, D.C., patch potholes within forty-eight hours of their being identified on Waze. “We’re helping reframe city planning through not just space but space and time,” she said…..(More)

Safecity: Combatting Sexual Violence Through Technology


Safecity … is a not-for-profit organization that provides a platform for people to share their personal stories of sexual harassment and abuse in public spaces. This data, which may be anonymous, gets aggregated as hot spots on a map indicating trends at a local level. The idea is to make this data useful for individuals, local communities and local administration for social and systemic change for safer cities. We launched on 26 Dec 2012 and since then have collected over 4000 stories from over 50 cities in India and Nepal.

How can Safecity help?
Safecity is a crowd map that converts these individual stories into data points plotted on a map, making it easier to see trends at the location level (e.g. a street). The focus is taken away from the individual victim so that we can instead focus on solving the problem at the local neighborhood level.

The Objectives:
• Create awareness of street harassment and abuse, and get people, especially women and victims of hate crimes and LGBTQ crimes, to break their silence and report their personal experiences.
• Collate this information to showcase location based trends.
• Make this information available and useful for individuals, local communities and local administration to solve the problem at the local level through urban planning aimed at addressing infrastructural deficits.
• Establish successful models of community engagement using crowd sourced data to solve civic and local issues.
• Reach out to women who do not have equal access to technology through our missed-dial facility, which lets them report any cases of abuse and harassment.

We wish to take this data forward to lobby for systemic change in urban planning and infrastructure, for legal reforms premised on gender equity, and for social change that loosens the shackles that otherwise do not allow us to live the way we want, with the freedom we want, and with the rights that are fundamental to all of us. Having as many passionate, concerned and diverse genders on board will only build our momentum further.

We are trying to build a movement by collecting these reports through campaigns, workshops and awareness programs with schools, colleges, local communities and partners who share our vision. Crime against women remains rampant and, even today, largely unreported. That silence needs to gain a voice, and the time is now. We are determined to highlight this serious social issue; we believe we are taking a step towards changing the way our society thinks and reacts, and we hope you are too. In time we hope it will lead to a safe and non-violent environment for all.

Safecity uses technology to document sexual harassment and abuse in public spaces in the following way. People can report incidents of sexual abuse and street harassment that they have experienced or witnessed. They can share solutions that can help avoid such situations and decide for themselves what works best for them, their geographic location or circumstances.

By allowing people to pin such incidents on a crowd-sourced map, we aim to let them highlight the “hotspots” of such activities. This accentuates the emerging trend in a particular area, enabling the citizens to acknowledge the problem, take personal precautions and devise a solution at the neighbourhood level.
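
To illustrate how such pinned reports can be rolled up into neighbourhood-level hotspots, here is a minimal sketch of the aggregation step. It is not Safecity’s actual code: the grid size, sample coordinates and field layout are assumptions made purely for the example.

```python
from collections import Counter

# Illustrative reports as (latitude, longitude, category) tuples;
# real reports would also carry timestamps, descriptions, etc.
reports = [
    (19.0760, 72.8777, "catcalling"),
    (19.0762, 72.8779, "stalking"),
    (19.0761, 72.8775, "groping"),
    (28.6139, 77.2090, "catcalling"),
]

def hotspot_counts(reports, cell_size=0.005):
    """Group individual reports into small grid cells (a few hundred
    metres across) so that trends show at street level rather than
    being tied to any individual victim."""
    counts = Counter()
    for lat, lon, _category in reports:
        cell = (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)
        counts[cell] += 1
    return counts

# The densest cells are the "hotspots" shaded on the crowd map.
for cell, count in hotspot_counts(reports).most_common(3):
    print(f"cell centred near {cell}: {count} report(s)")
```

A real deployment would add time windows and category filters, but the core idea is the same: aggregate first, then map.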

Safecity believes in uniting millions of voices that can become a catalyst for change.

You can read the FAQs section for more information on how the data is used for public good. (More)”

India wants all government organizations to develop open APIs


Medianama: “The department of electronics and information technology (DeitY) is looking to frame a policy (pdf) for adopting and developing open application programming interfaces (APIs) in government organizations to promote software interoperability for all e-governance applications & systems. The policy shall be applicable to all central government organizations and to those state governments that choose to adopt the policy.

DeitY also said that all information and data of a government organisation shall be made available through open APIs, in line with the National Data Sharing and Accessibility Policy, and shall adhere to the National Cyber Security Policy.

Policy points

– Each published API of a Government organization shall be provided free of charge, whenever possible, to other government organizations and the public.

– Each published API shall be properly documented with sample code and sufficient information for developers to make use of the API.

– The life-cycle of the open API shall be made available by the API publishing Government organisation. The API shall be backward compatible with at least two earlier versions.

– Government organizations may use an authentication mechanism to enable service interoperability and single sign-on.

– All Open API systems built and data provided shall adhere to GoI security policies and guidelines.

…. This would allow anyone to build a website or an application and pull government information into the public domain. Everyone knows navigating a government website can be nightmarish. For example, Indian Railways provides open APIs which enabled the development of applications such as RailYatri. Through the eRail APIs, the application pulls information including the list of stations, trains between stations, the route of a train, train fares, PNR status, live running status, seat availability, and details of cancelled, rescheduled or diverted trains. …(More)”
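
As a rough illustration of what consuming such an open API might look like, the sketch below queries a hypothetical train-listing endpoint. The base URL, parameters and JSON fields are invented for the example; a real client would follow whatever documentation the publishing organisation provides (the draft policy requires each published API to be documented with sample code).

```python
import requests

# Hypothetical open-data endpoint; the real eRail APIs have their own
# documented URLs, parameters and response formats.
BASE_URL = "https://api.example.gov.in/rail/v1"

def trains_between(src_code: str, dest_code: str) -> list:
    """Return the trains running between two station codes."""
    resp = requests.get(
        f"{BASE_URL}/trains",
        params={"from": src_code, "to": dest_code},
        timeout=10,
    )
    resp.raise_for_status()  # surface HTTP errors early
    return resp.json().get("trains", [])

if __name__ == "__main__":
    for train in trains_between("NDLS", "BCT"):  # New Delhi -> Mumbai Central
        print(train.get("number"), train.get("name"))
```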

See also “Policy on Open Application Programming Interfaces (APIs) for Government of India”

Shedding light on government, one dataset at a time


Bill Below of the OECD Directorate for Public Governance and Territorial Development at OECD Insights: “…As part of its Open Government Data (OGD) work, the OECD has created OURdata, an index that assesses governments’ efforts to implement OGD in three critical areas: Openness, Usefulness and Re-usability. The results are promising. Those countries that began the process in earnest some five years ago, today rank very high on the scale. According to this Index, which closely follows the principles of the G8 Open Data Charter, Korea is leading the implementation of OGD initiatives with France a close second.

[Chart: OURdata Index]
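
As a toy illustration of how a composite index of this kind can be assembled, the sketch below rolls three pillar scores into a single ranking. The figures and the equal weighting are invented for the example and are not the OECD’s published OURdata scores or methodology.

```python
# Purely illustrative pillar scores on a 0-1 scale; these are NOT the
# OECD's published OURdata figures.
pillar_scores = {
    "Country A": {"openness": 0.90, "usefulness": 0.80, "reusability": 0.85},
    "Country B": {"openness": 0.70, "usefulness": 0.60, "reusability": 0.65},
}

def composite(scores: dict) -> float:
    """Equal-weighted average of the three pillars (an assumption)."""
    return sum(scores.values()) / len(scores)

# Rank countries by their composite score, highest first.
for country in sorted(pillar_scores, key=lambda c: composite(pillar_scores[c]),
                      reverse=True):
    print(country, round(composite(pillar_scores[country]), 3))
```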

Those who have started the process but who are lagging (such as Poland) can draw on the experience of other OECD countries, and benefit from a clear roadmap to guide them.

Indeed, bringing one’s own country’s weaknesses out into the light is the first, and sometimes most courageous, step towards achieving the benefits of OGD. Poland has just completed its Open Government Data country review with the OECD, which revealed some sizable challenges ahead in transforming the internal culture of its institutions. For the moment, a supply-side rather than people-driven approach to data release is prevalent. Also, OGD in Poland is not widely understood to be a source of value creation and growth….(More)”

Remote Voting and Beyond: How Tech Will Transform Government From the Inside Out


Springwise: “…Technology, and in particular the internet, is often seen as a potential stumbling block for government. But this perception acts as a brake on innovation in public services and in politics more generally. By embracing technology, rather than warily containing it, governments globally could benefit hugely. In terms of formulating and executing policy, technology can help governments become more transparent, accountable and effective, while improving engagement and participation from regular citizens.

On engagement, for instance, technology is opening up new avenues which make taking part in the political process far more straightforward. Springwise-featured Harvard startup Voatz is building a platform that allows users to vote, make campaign donations and complete opinion polls from their smartphones. The app, which uses biometric authentication to ensure that identities are comprehensively verified, could well entice younger voters who are alienated by the ballot box. Melding the simplicity of apps with sophisticated identity verification technology, Voatz is just one example of how tech can disrupt government for good.

From the Ground Up…

The potential for active participation goes far beyond voting. E-focus groups, online petitions and campaign groups have the power to transform the interaction between political establishments and citizens. From fact-checking charities enabled by crowdfunding such as UK-based Full Fact to massive national campaigns conducted online, citizens connected by technology are using their collective power to reshape government in democratic countries. Under other regimes, such as in the People’s Republic of China, vigilante citizens are circumventing extensive firewalls to shine a light on official misconduct.

…and the Top Down

As well as an abundance of citizen-led efforts to improve governance, there are significant moves from governments themselves to shake up public service delivery. Even HealthCare.gov, flawed though the roll-out was, marks a hugely ambitious piece of government reform underpinned by technology. Indeed, Obama has shown an unprecedented willingness to embrace technology in his two terms, appointing chief information and technology officers, promising to open up government data and launching the @POTUS Twitter account last month. Clearly, recognition is there from governments that technology can be a game changer for their headline policies.

While many countries are using technology for individual projects, there is one government that is banking its entire national success on tech – Estonia. The tiny, sparsely populated country in Eastern Europe is one of the most technologically advanced in the world. Everything from citizen IDs to tax returns and health records makes use of technology and is efficient and ‘future-proofed’ as a result.

Whether as a threat or an opportunity, technology represents a transformative influence on government. Its potential as a disruptive, reshaping force has fed a narrative that casts technology as a looming threat and a destabiliser of conventional power structures. But harnessed properly and executed effectively, technology can remold government for the better, improving big public service projects, raising participation and engaging a young population whose default is digital….(More)”

Selected Readings on Data Governance


Jos Berens (Centre for Innovation, Leiden University) and Stefaan G. Verhulst (GovLab)

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data governance was originally published in 2015.

Context
The field of Data Collaboratives is premised on the idea that sharing and opening up private-sector datasets has great – and yet untapped – potential for promoting social good. At the same time, the potential of data collaboratives depends on the level of societal trust in the exchange, analysis and use of the data exchanged. Strong data governance frameworks are essential to ensure responsible data use. Without such governance regimes, the emergent data ecosystem will be hampered and the (perceived) risks will dominate the (perceived) benefits. Further, without adopting a human-centered approach to the design of data governance frameworks, including iterative prototyping and careful consideration of user experience, the responses may fail to be flexible and targeted to real needs.

Annotated Selected Readings List (in alphabetical order)

Better Place Lab, “Privacy, Transparency and Trust.” Mozilla, 2015. Available from: http://www.betterplace-lab.org/privacy-report.

  • This report looks specifically at the risks involved in the social sector having access to datasets, and the main risks development organizations should focus on to develop a responsible data use practice.
  • Focusing on five specific countries (Brazil, China, Germany, India and Indonesia), the report displays specific country profiles, followed by a comparative analysis centering around the topics of privacy, transparency, online behavior and trust.
  • Some of the key findings mentioned are:
    • A general concern about the importance of privacy, with cultural differences influencing conceptions of what privacy is.
    • Cultural differences determining how transparency is perceived, and how much value is attached to achieving it.
    • To build trust, individuals need to feel a personal connection or get a personal recommendation – it is hard to build trust regarding automated processes.

Montjoye, Yves-Alexandre de; Kendall, Jake; and Kerry, Cameron F. “Enabling Humanitarian Use of Mobile Phone Data.” The Brookings Institution, 2015. Available from: http://www.brookings.edu/research/papers/2014/11/12-enabling-humanitarian-use-mobile-phone-data.

  • Focusing in particular on mobile phone data, this paper explores ways of mitigating privacy harms involved in using call detail records for social good.
  • Key takeaways are the following recommendations for using data for social good:
    • Engaging companies, NGOs, researchers, privacy experts, and governments to agree on a set of best practices for new privacy-conscientious metadata sharing models.
    • Accepting that no framework for maximizing data for the public good will offer perfect protection for privacy, but there must be a balanced application of privacy concerns against the potential for social good.
    • Establishing systems and processes for recognizing trusted third-parties and systems to manage datasets, enable detailed audits, and control the use of data so as to combat the potential for data abuse and re-identification of anonymous data.
    • Simplifying the process for governments of developing countries with regard to the collection and use of mobile phone metadata for research and public-good purposes.

Center for Democracy and Technology, “Health Big Data in the Commercial Context.” Center for Democracy and Technology, 2015. Available from: https://cdt.org/insight/health-big-data-in-the-commercial-context/.

  • Focusing particularly on the privacy issues related to using data generated by individuals, this paper explores the overlap in privacy questions this field has with other data uses.
  • The authors note that although the Health Insurance Portability and Accountability Act (HIPAA) has proven a successful approach in ensuring accountability for health data, most of these standards do not apply to developers of the new technologies used to collect these new data sets.
  • For non-HIPAA-covered, customer-facing technologies, the paper outlines an alternative framework for considering privacy issues. The framework is based on the Fair Information Practice Principles and three rounds of stakeholder consultations.

Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2015. Available from: https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.

  • This white paper is part of a project aiming to explain what is often referred to as a new, risk-based approach to privacy, and the development of a privacy risk framework and methodology.
  • With the pace of technological progress often outstripping privacy officers’ ability to keep up, this method aims to offer a structured way to approach privacy matters, assessing privacy implications from the perspective of possible negative impacts on individuals.
  • With the intended outcomes of the project being “materials to help policy-makers and legislators to identify desired outcomes and shape rules for the future which are more effective and less burdensome”, insights from this paper might also feed into the development of innovative governance mechanisms aimed specifically at preventing individual harm.

Centre for Information Policy Leadership, “Data Governance for the Evolving Digital Market Place”, Centre for Information Policy Leadership, Hunton & Williams LLP, 2011. Available from: http://www.huntonfiles.com/files/webupload/CIPL_Centre_Accountability_Data_Governance_Paper_2011.pdf.

  • This paper argues that as a result of the proliferation of large scale data analytics, new models governing data inferred from society will shift responsibility to the side of organizations deriving and creating value from that data.
  • Noting the challenge corporations face in enabling agile and innovative data use, the authors argue that “in exchange for increased corporate responsibility, accountability [and the governance models it mandates, ed.] allows for more flexible use of data.”
  • Proposed as a means to shift responsibility to the side of data users, the accountability principle has been researched by a worldwide group of policymakers. Tracing the history of the accountability principle, the paper argues that it “(…) requires that companies implement programs that foster compliance with data protection principles, and be able to describe how those programs provide the required protections for individuals.”
  • The following essential elements of accountability are listed:
    • Organisation commitment to accountability and adoption of internal policies consistent with external criteria
    • Mechanisms to put privacy policies into effect, including tools, training and education
    • Systems for internal, ongoing oversight and assurance reviews and external verification
    • Transparency and mechanisms for individual participation
    • Means of remediation and external enforcement

Crawford, Kate; Schultz, Jason. “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.” NYU School of Law, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784&download=yes.

  • Considering the privacy implications of large-scale analysis of numerous data sources, this paper proposes the implementation of a ‘procedural data due process’ mechanism to arm data subjects against potential privacy intrusions.
  • The authors acknowledge that some existing privacy protection structures already include similar mechanisms. However, due to the “inherent analytical assumptions and methodological biases” of big data systems, the authors argue for a more rigorous framework.

Letouzé, Emmanuel; and Vinck, Patrick. “The Ethics and Politics of Call Data Analytics.” Data-Pop Alliance, 2015. Available from: http://static1.squarespace.com/static/531a2b4be4b009ca7e474c05/t/54b97f82e4b0ff9569874fe9/1421442946517/WhitePaperCDRsEthicFrameworkDec10-2014Draft-2.pdf.

  • Focusing on the use of Call Detail Records (CDRs) for social good in development contexts, this whitepaper explores both the potential of these datasets – in part by detailing recent successful efforts in the space – and political and ethical constraints to their use.
  • Drawing from the Menlo Report Ethical Principles Guiding ICT Research, the paper explores how these principles might be unpacked to inform an ethics framework for the analysis of CDRs.

Data for Development External Ethics Panel, “Report of the External Ethics Review Panel.” Orange, 2015. Available from: http://www.d4d.orange.com/fr/content/download/43823/426571/version/2/file/D4D_Challenge_DEEP_Report_IBE.pdf.

  • This report presents the findings of the external expert panel overseeing the Orange Data for Development Challenge.
  • Several types of issues faced by the panel are described, along with the various ways in which the panel dealt with those issues.

Federal Trade Commission Staff Report, “Mobile Privacy Disclosures: Building Trust Through Transparency.” Federal Trade Commission, 2013. Available from: www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf.

  • This report looks at ways to address privacy concerns regarding mobile phone data use. Specific advice is provided for the following actors:
    • Platforms, or operating systems providers
    • App developers
    • Advertising networks and other third parties
    • App developer trade associations, along with academics, usability experts and privacy researchers

Mirani, Leo. “How to use mobile phone data for good without invading anyone’s privacy.” Quartz, 2015. Available from: http://qz.com/398257/how-to-use-mobile-phone-data-for-good-without-invading-anyones-privacy/.

  • This paper considers the privacy implications of using call detail records for social good, and ways to mitigate risks of privacy intrusion.
  • Taking the example of the Orange D4D challenge and the anonymization strategy employed there, the paper describes how classic ‘anonymization’ is often not enough. The paper then lists further measures that can be taken to ensure adequate privacy protection.

Bernholz, Lucy. “Several Examples of Digital Ethics and Proposed Practices.” Stanford Ethics of Data Conference, 2014. Available from: http://www.scribd.com/doc/237527226/Several-Examples-of-Digital-Ethics-and-Proposed-Practices.

  • This list of readings prepared for Stanford’s Ethics of Data conference lists some of the leading available literature regarding ethical data use.

Abrams, Martin. “A Unified Ethical Frame for Big Data Analysis.” The Information Accountability Foundation, 2014. Available from: http://www.privacyconference2014.org/media/17388/Plenary5-Martin-Abrams-Ethics-Fundamental-Rights-and-BigData.pdf.

  • Going beyond privacy, this paper discusses the following elements as central to developing a broad framework for data analysis:
    • Beneficial
    • Progressive
    • Sustainable
    • Respectful
    • Fair

Lane, Julia; Stodden, Victoria; Bender, Stefan; and Nissenbaum, Helen. “Privacy, Big Data and the Public Good.” Cambridge University Press, 2014. Available from: http://www.dataprivacybook.org.

  • This book treats the privacy issues surrounding the use of big data for promoting the public good.
  • The questions being asked include the following:
    • What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens?
    • What are the rules of engagement?
    • What are the best ways to provide access while protecting confidentiality?
    • Are there reasonable mechanisms to compensate citizens for privacy loss?

Richards, Neil M.; and King, Jonathan H. “Big Data Ethics.” Wake Forest Law Review, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2384174.

  • This paper describes the growing impact of big data analytics on society, and argues that because of this impact, a set of ethical principles to guide data use is called for.
  • The four proposed themes are: privacy, confidentiality, transparency and identity.
  • Finally, the paper discusses how big data can be integrated into society, going into multiple facets of this integration, including the law, roles of institutions and ethical principles.

OECD, “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data”. Available from: http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

  • A globally used set of principles to inform thinking about the handling of personal data, the OECD privacy guidelines serve as one of the leading standards for informing privacy policies and data governance structures.
  • The basic principles of national application are the following:
    • Collection Limitation Principle
    • Data Quality Principle
    • Purpose Specification Principle
    • Use Limitation Principle
    • Security Safeguards Principle
    • Openness Principle
    • Individual Participation Principle
    • Accountability Principle

The White House Big Data and Privacy Working Group, “Big Data: Seizing Opportunities, Preserving Values.” White House, 2014. Available from: https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf.

  • Documenting the findings of the White House big data and privacy working group, this report lists, among others, the following key recommendations regarding data governance:
    • Bringing greater transparency to the data services industry
    • Stimulating international conversation on big data, with multiple stakeholders
    • With regard to educational data: ensuring data is used for the purpose it is collected for
    • Paying attention to the potential for big data to facilitate discrimination, and expanding technical understanding to stop discrimination

Hoffman, William. “Pathways for Progress.” World Economic Forum, 2015. Available from: http://www3.weforum.org/docs/WEFUSA_DataDrivenDevelopment_Report2015.pdf.

  • This paper identifies, among other things, the lack of well-defined and balanced governance mechanisms as one of the key obstacles preventing corporate-sector data in particular from being shared in a controlled space.
  • An approach that balances the benefits against the risks of large-scale data usage in a development context, building trust among all stakeholders in the data ecosystem, is viewed as key.
  • Furthermore, this whitepaper notes that new governance models are required not just because of the growing amount of data, increased analytical capacity and more refined methods of analysis. The current “super-structure” of information flows between institutions is also seen as one of the key reasons to develop alternatives to the current, outdated approaches to data governance.

India asks its citizens: please digitise our files


Joshua Chambers in FutureGov: “India has asked its citizens to help digitise records so that it can move away from paper processes.

Using its crowdsourcing web site MyGov, the government wrote that “we cannot talk of Digital India and transforming India into a knowledge society if most of the transactions continue to be physical.”

It is “essential” that paper records are converted into machine-readable digital versions, the government added, but “the cost of such digitisation is very large and existing budgetary constraints of government and many other organisations do not allow such lavish digitisation effort.”

Consequently, the government is asking citizens for advice on how to build a cheap content management system and tools that will allow it to crowdsource the transcription of records. Citizens would be rewarded for every word they transcribe through a points system, and points can then be redeemed for cash prizes.

“The proposed platform will create earning and income generation opportunities for our literate rural and urban citizens, develop digital literacy and IT skills and include them in the making of Digital India,” the government added.

The announcement also noted the importance of privacy, suggesting that documents be split so that no portion gives any clue regarding the overall content of the document.

In addition, two people will be given the same words to transcribe, and the software will compare their transcriptions to ensure accuracy. Only successful transcriptions will be rewarded with points….(More)”
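
A minimal sketch of that double-entry check might look like the following. The matching rule, point values and acceptance threshold are assumptions; the announcement does not specify them.

```python
def score_transcription(transcript_a: str, transcript_b: str,
                        points_per_word: int = 1):
    """Compare two independent transcriptions of the same snippet and
    award points only for words on which both contributors agree."""
    words_a = transcript_a.split()
    words_b = transcript_b.split()
    agreed = [a for a, b in zip(words_a, words_b)
              if a.casefold() == b.casefold()]
    # Accept the snippet only when both transcriptions agree in full;
    # a production system might use a softer threshold or a third pass.
    verified = len(agreed) == len(words_a) == len(words_b)
    return verified, len(agreed) * points_per_word

ok, points = score_transcription("Land records, Ward 7, 1952",
                                 "Land records, Ward 7, 1952")
print(ok, points)  # -> True 5
```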

Nepal Aid Workers Helped by Drones, Crowdsourcing


Shirley Wang et al in the Wall Street Journal: “….It is too early to gauge the exact impact of the technology in Nepal relief efforts, which have just begun amid chaos on the ground. Aid organizations have reported that hospitals are overstretched, a shortage of capacity at Katmandu’s airport is crippling aid distribution, and damaged roads and the mountainous country’s difficult terrain make reaching villages difficult.

Still, technology is playing an increasing role in the global response to humanitarian crises. Within hours of Saturday’s 7.8-magnitude temblor, U.S. giants such as Google Inc. and Facebook Inc. were offering their networks for use in verifying survivors and helping worried friends and relatives locate their loved ones.

Advances in online mapping—long used to calculate distances and plot driving routes—and the capabilities of camera-equipped drones are playing an increasingly important role in coordinating emergency responses at ground zero of any disaster.

A community of nonprofit groups uses satellite images, private images and open-source mapping technology to remap areas affected by the earthquake. They mark damaged buildings and roads so rescuers can identify the worst-hit areas and assess how accessible different areas are. The technology complements more traditional intelligence from aircraft.

Such crowdsourced real-time mapping technologies were first used in the 2010 Haiti earthquake, according to Chris Grundy, a professor in Geographical Information Systems at the London School of Hygiene and Tropical Medicine. The technology “has been advancing a little bit every time [every situation where it is used] as we start to see what works,” said Prof. Grundy.

The American Red Cross supplied its relief team on the Wednesday night flight to Nepal from Washington, D.C. with 50 digital maps and an inch-thick pile of paper maps that help identify where the needs are. The charity has a mapping project with the British Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, a crowdsourced data-sharing group.

Mapping efforts have grown substantially since Haiti, according to Dale Kunce, head of the geographic information systems team at the American Red Cross. In the two months after the Haiti temblor, 600 mapping contributors made 1.5 million edits, while in the first 48 hours after the Nepal earthquake, 2,000 mappers had already made three million edits, Mr. Kunce said.

Some 3,400 volunteers from around the world are now inspecting images of Nepal online to identify road networks and conditions, to assess the extent of damage and pinpoint open spaces where displaced persons tend to congregate, according to Nama Budhathoki, executive director of a nonprofit technology company called Katmandu Living Labs.

His group is operating from a cramped but largely undamaged meeting room in a central-Katmandu office building to help coordinate the global effort of various mapping organizations with the needs of agencies like Doctors Without Borders and the international Red Cross community.

In recent days the Nepal Red Cross and Nepalese army have requested and been supplied with updated maps of severely damaged districts, said Dr. Budhathoki….(More)”

Five Headlines from a Big Month for the Data Revolution


Sarah T. Lucas at Post2015.org: “If the history of the data revolution were written today, it would include three major dates. May 2013, when the High-Level Panel on the Post-2015 Development Agenda first coined the phrase “data revolution.” November 2014, when the UN Secretary-General’s Independent Expert Advisory Group (IEAG) set a vision for it. And April 2015, when five headliner stories pushed the data revolution from great idea to a concrete roadmap for action.

The April 2015 Data Revolution Headlines

1. The African Data Consensus puts Africa in the lead on bringing the data revolution to the regional level. The Africa Data Consensus (ADC) envisions “a profound shift in the way that data is harnessed to impact on development decision-making, with a particular emphasis on building a culture of usage.” The ADC finds consensus across 15 “data communities”—ranging from open data to official statistics to geospatial data, and is endorsed by Africa’s ministers of finance. The ADC gets top billing in my book, as the first contribution that truly reflects a large diversity of voices and creates a political hook for action. (Stay tuned for a blog from my colleague Rachel Quint on the ADC).

2. The Sustainable Development Solutions Network (SDSN) gets our minds (and wallets) around the data needed to measure the SDGs. The SDSN Needs Assessment for SDG Monitoring and Statistical Capacity Development maps the investments needed to improve official statistics. My favorite parts are the clear typology of data (see pg. 12), and that the authors are very open about the methods, assumptions, and leaps of faith they had to take in the costing exercise. They also start an important discussion about how advances in information and communications technology, satellite imagery, and other new technologies have the potential to expand coverage, increase analytic capacity, and reduce the cost of data systems.

3. The Overseas Development Institute (ODI) calls on us to find the “missing millions.” ODI’s The Data Revolution: Finding the Missing Millions presents the stark reality of data gaps and what they mean for understanding and addressing development challenges. The authors highlight that even that most fundamental of measures—of poverty levels—could be understated by as much as a quarter. And that’s just the beginning. The report also pushes us to think beyond the costs of data, and focus on how much good data can save. With examples of data lowering the cost of doing government business, the authors remind us to think about data as an investment with real economic and social returns.

4. Paris21 offers a roadmap for putting national statistical offices (NSOs) at the heart of the data revolution. Paris21’s Roadmap for a Country-Led Data Revolution does not mince words. It calls on the data revolution to “turn a vicious cycle of [NSO] underperformance and inadequate resources into a virtuous one where increased demand leads to improved performance and an increase in resources and capacity.” It makes the case for why NSOs are central and need more support, while also pushing them to modernize, innovate, and open up. The roadmap gets my vote for best design. This ain’t your grandfather’s statistics report!

5. The Cartagena Data Festival features real-live data heroes and fosters new partnerships. The Festival featured data innovators (such as terra-i using satellite data to track deforestation), NSOs on the leading edge of modernization and reform (such as Colombia and the Philippines), traditional actors using old data in new ways (such as the Inter-American Development Bank’s fantastic energy database), groups focused on citizen-generated data (such as The Data Shift and UN My World), private firms working with big data for social good (such as Telefónica), and many others—all reminding us that the data revolution is well underway and will not be stopped. Most importantly, it brought these actors together in one place. You could see the sparks flying as folks learned from each other and hatched plans together. The Festival gets my vote for best conference of a lifetime, with the perfect blend of substantive sessions, intense debate, learning, inspiration, new connections, and a lot of fun. (Stay tuned for a post from my colleague Kristen Stelljes and me for more on Cartagena).

This month full of headlines leaves no room for doubt—momentum is building fast on the data revolution. And just in time.

With the Financing for Development (FFD) conference in Addis Ababa in July, the agreement of Sustainable Development Goals in New York in September, and the Climate Summit in Paris in December, this is a big political year for global development. Data revolutionaries must seize this moment to push past vision, past roadmaps, to actual action and results…..(More)”

How Google and Facebook are finding victims of the Nepal earthquake


Caitlin Dewey in the Washington Post: “As the death toll from Saturday’s 7.8-magnitude Nepalese earthquake inches higher, help in finding and identifying missing persons has come from an unusual source: Silicon Valley tech giants.

Both Google and Facebook deployed collaborative, cellphone-based tools over the weekend to help track victims of the earthquake. In the midst of both companies’ big push to bring Internet to the developing world, it’s an important illustration of exactly how powerful that connectivity could be. And yet, in a country like Nepal — where there are only 77 cellphone subscriptions per 100 people versus 96 in the U.S. and 125 in the U.K. — it’s also a reminder of how very far that effort still has to go.

Facebook Safety Check

Facebook’s Safety Check essentially lets users do two things, depending on where they are. Users in an area impacted by a natural disaster can log onto the site and mark themselves as “safe.” Meanwhile, users around the world can log into the site and check if any of their friends are in the impacted area. The tool was built by Japanese engineers in response to the 2011 earthquake and tsunami that devastated coastal Japan.

Facebook hasn’t publicized how many people have used the tool, though the network only has 4.4 million users in the country, based on estimates by its ad platform. Notably, you must also have a smartphone running the Facebook app to use this feature — and smartphone penetration in Nepal is quite low.

Google Person Finder

Like Safety Check, Google Person Finder is intended to connect people in a disaster area with friends and family around the world. Google’s five-year-old project also operates on a larger scale, however: It basically provides a massive, open platform to collaboratively track missing persons’ reports. Previously, Google’s deployed the tool to help victims in the wake of Typhoon Haiyan and the Boston bombing.
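
To make the idea of a collaboratively maintained missing-persons registry concrete, here is a simplified, hypothetical sketch. The record fields and the naive substring search are assumptions for illustration and do not reflect Person Finder’s actual schema or matching logic.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class PersonRecord:
    """A simplified missing-person entry; fields are illustrative only."""
    full_name: str
    home_city: str
    status: str = "unknown"  # e.g. "unknown", "alive", "missing"
    notes: list = field(default_factory=list)
    updated: datetime = field(default_factory=datetime.now)

def search(records: list, query: str) -> list:
    """Case-insensitive substring match on names; a real service would
    need fuzzier matching for transliterated or misspelled names."""
    q = query.casefold()
    return [r for r in records if q in r.full_name.casefold()]

registry = [
    PersonRecord("Sita Sharma", "Kathmandu", status="alive"),
    PersonRecord("Ram Thapa", "Gorkha"),
]
for hit in search(registry, "sharma"):
    print(hit.full_name, hit.status)
```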