What Should We Do About Big Data Leaks?


Paul Ford at the New Republic: “I have a great fondness for government data, and the government has a great fondness for making more of it. Federal elections financial data, for example, with every contribution identified, connected to a name and address. Or the results of the census. I don’t know if you’ve ever had the experience of downloading census data but it’s pretty exciting. You can hold America on your hard drive! Meditate on the miracles of zip codes, the way the country is held together and addressable by arbitrary sets of digits.

You can download whole books, in PDF format, about the foreign policy of the Reagan Administration as it related to Russia. Negotiations over which door the Soviet ambassador would use to enter a building. Gigabytes and gigabytes of pure joy for the ephemeralist. The government is the greatest creator of ephemera ever.

Consider the Financial Crisis Inquiry Commission, or FCIC, created in 2009 to figure out exactly how the global economic pooch was screwed. The FCIC has made so much data, and has done an admirable job (caveats noted below) of arranging it. So much stuff. There are reams of treasure on a single FCIC web site, hosted at Stanford Law School: Hundreds of MP3 files, for example, with interviews with Jamie Dimon of JPMorgan Chase and Lloyd Blankfein of Goldman Sachs. I am desperate to find time to write some code that automatically extracts random audio snippets from each and puts them on top of a slow ambient drone with plenty of reverb, so that I can relax to the dulcet tones of the financial industry explaining away its failings. (There’s a Paul Krugman interview that I assume is more critical.)
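Ford’s snippet-collage idea is concrete enough to sketch in a few lines. Below is a playful, hedged version assuming pydub (which needs ffmpeg) is installed; the folder of downloaded interview MP3s and the mix parameters are hypothetical, and since pydub has no built-in reverb, that finishing touch is left out.

```python
# A sketch of Ford's ambient-drone idea, assuming pydub + ffmpeg.
# "fcic_interviews/" is a hypothetical folder of downloaded MP3s.
import random
from pathlib import Path

from pydub import AudioSegment
from pydub.generators import Sine

SNIPPET_MS = 8_000  # eight-second snippets

# A low sine drone stands in for the "slow ambient drone."
drone = Sine(55).to_audio_segment(duration=60_000).apply_gain(-18)

mix = drone
for mp3 in random.sample(list(Path("fcic_interviews").glob("*.mp3")), 5):
    interview = AudioSegment.from_mp3(mp3)
    start = random.randint(0, max(0, len(interview) - SNIPPET_MS))
    snippet = interview[start:start + SNIPPET_MS].fade_in(500).fade_out(500)
    # Drop each snippet at a random point over the drone.
    mix = mix.overlay(snippet, position=random.randint(0, 50_000))

mix.export("dulcet_tones_of_finance.mp3", format="mp3")
```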

The recordings are just the beginning. They’ve released so many documents, and with the documents, a finding aid that you can download in handy PDF format, which will tell you where to, well, find things, pointing to thousands of documents. That aid alone is 1,439 pages.

Look, it is excellent that this exists, in public, on the web. But it also presents a very contemporary problem: What is transparency in the age of massive database drops? The data is available, but locked in MP3s and PDFs and other documents; it’s not searchable in the way a web page is searchable, not easy to comment on or share.

Consider the WikiLeaks release of State Department cables. They were exhausting, there were so many of them, they were in all caps. Or the trove of data Edward Snowden gathered on a USB drive, or Chelsea Manning on CD. And the Ashley Madison leak, spread across database files and logs of credit card receipts. The massive and sprawling Sony leak, complete with whole email inboxes. And with the just-released Panama Papers, we see two exciting new developments: First, the consortium of media organizations that managed the leak actually came together and collectively, well, branded the papers, down to a hashtag (#panamapapers), informational website, etc. Second, the size of the leak itself—2.5 terabytes!—became a talking point, even though exactly what was contained within those terabytes was harder to pin down. This, said the consortium of journalists that notably did not include The New York Times, The Washington Post, etc., is the big one. Stay tuned. And we are. But the fact remains: These artifacts are not accessible to any but the most assiduous amateur conspiracist; they’re the domain of professionals with the time and money to deal with them. Who else could be bothered?

If you watched the movie Spotlight, you saw journalists at work, pawing through reams of documents, going through, essentially, phone books. I am an inveterate downloader of such things. I love what they represent. And I’m also comfortable with many-gigabyte corpora spread across web sites. I know how to fetch data, how to consolidate it, and how to search it. I share this skill set with many data journalists, and these capacities have, in some ways, become the sole province of the media. Organs of journalism are among the only remaining cultural institutions that can fund investigations of this size and tease the data apart, identifying linkages and thus constructing informational webs that can, with great effort, be turned into narratives, yielding something like what we call “a story” or “the truth.” 
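That fetch-consolidate-search skill set is less exotic than it sounds. As a rough illustration — not any newsroom’s actual pipeline — here is a minimal loop in Python using SQLite’s FTS5 full-text index; the document URLs are hypothetical stand-ins for something like the FCIC finding aid.

```python
# A minimal sketch of fetching, consolidating, and searching a document
# corpus; URLs are hypothetical. Assumes requests and an SQLite build
# with the FTS5 extension (standard in most Python distributions).
import sqlite3

import requests

urls = [
    "https://example.org/docs/interview-001.txt",
    "https://example.org/docs/interview-002.txt",
]

db = sqlite3.connect("corpus.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(url, body)")

for url in urls:
    text = requests.get(url, timeout=30).text
    db.execute("INSERT INTO docs VALUES (?, ?)", (url, text))
db.commit()

# Full-text search across the whole pile, best matches first.
for (url,) in db.execute(
    "SELECT url FROM docs WHERE docs MATCH ? ORDER BY rank", ("mortgage",)
):
    print(url)
```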

Spotlight was set around 2001, and it features a lot of people looking at things on paper. The problem has changed greatly since then: The data is everywhere. The media has been forced into a new cultural role, that of the arbiter of the giant and semi-legal database. ProPublica, a nonprofit that does a great deal of data gathering and data journalism and then shares its findings with other media outlets, is one example; it funded a project called DocumentCloud with other media organizations that simplifies the process of searching through giant piles of PDFs (e.g., court records, or the results of Freedom of Information Act requests).

At some level the sheer boredom and drudgery of managing these large data leaks make them immune to casual interest; even the Ashley Madison leak, which I downloaded, was basically an opaque pile of data and really quite boring unless you had some motive to poke around.

If this is the age of the citizen journalist, or at least the citizen opinion columnist, it’s also the age of the data journalist, with the news media acting as product managers of data leaks, making the information usable, browsable, attractive. There is an uneasy partnership between leakers and the media, just as there is an uneasy partnership between the press and the government, which would like some credit for its efforts, thank you very much, and wouldn’t mind if you gave it some points for transparency while you’re at it.

Pause for a second. There’s a glut of data, but most of it comes to us in ugly formats. What would happen if the things released in the interest of transparency were released in actual transparent formats?…(More)”

Automating power: Social bot interference in global politics


Samuel C. Woolley at First Monday: “Over the last several years political actors worldwide have begun harnessing the digital power of social bots — software programs designed to mimic human social media users on platforms like Facebook, Twitter, and Reddit. Increasingly, politicians, militaries, and government-contracted firms use these automated actors in online attempts to manipulate public opinion and disrupt organizational communication. Politicized social bots — here ‘political bots’ — are used to massively boost politicians’ follower levels on social media sites in attempts to generate false impressions of popularity. They are programmed to actively and automatically flood news streams with spam during political crises, elections, and conflicts in order to interrupt the efforts of activists and political dissidents who publicize and organize online. They are used by regimes to send out sophisticated computational propaganda. This paper conducts a content analysis of available media articles on political bots in order to build an event dataset of global political bot deployment that codes for usage, capability, and history. This information is then analyzed, generating a global outline of this phenomenon. This outline seeks to explain the variety of political bot-oriented strategies and presents details crucial to building understandings of these automated software actors in the humanities, social and computer sciences….(More)”
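The codebook itself is not reproduced in the abstract, but the shape of such an event dataset is easy to imagine. A hypothetical sketch — the fields and example rows below are illustrative, not Woolley’s actual coding scheme:

```python
# A hypothetical sketch of an event record for political-bot deployments;
# fields and examples are illustrative, not the paper's codebook.
from collections import Counter
from dataclasses import dataclass


@dataclass
class BotEvent:
    country: str
    year: int
    usage: str        # e.g., boosting follower counts, flooding hashtags
    capability: str   # e.g., scheduled posting vs. interactive replies
    history: str      # prior documented deployments in the same country


events = [
    BotEvent("Mexico", 2012, "flood activist hashtags with spam",
             "scheduled posting", "used in earlier election cycles"),
    BotEvent("Syria", 2011, "drown out protest hashtags",
             "scheduled posting", "first documented deployment"),
]

# Tallying coded events is the raw material for a "global outline."
print(Counter(event.usage for event in events))
```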

Selected Readings on Data and Humanitarian Response


By Prianka Srinivasan and Stefaan G. Verhulst *

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data and humanitarian response was originally published in 2016.

Data, when used well in a trusted manner, allows humanitarian organizations to innovate how to respond to emergency events, including better coordination of post-disaster relief efforts, the ability to harness local knowledge to create more targeted relief strategies, and tools to predict and monitor disasters in real time. Consequently, in recent years both multinational groups and community-based advocates have begun to integrate data collection and evaluation strategies into their humanitarian operations, to better and more quickly respond to emergencies. However, this movement poses a number of challenges. Compared to the private sector, humanitarian organizations are often less equipped to successfully analyze and manage big data, which poses a number of risks related to the security of victims’ data. Furthermore, complex power dynamics which exist within humanitarian spaces may be further exacerbated through the introduction of new technologies and big data collection mechanisms. Below we share:

  • Selected Reading List (summaries and hyperlinks)
  • Annotated Selected Reading List
  • Additional Readings

Selected Reading List (summaries in alphabetical order)

Data and Humanitarian Response

Risks of Using Big Data in Humanitarian Context

Annotated Selected Reading List (in alphabetical order)

Karlsrud, John. “Peacekeeping 4.0: Harnessing the Potential of Big Data, Social Media, and Cyber Technologies.” Cyberspace and International Relations, 2013. http://bit.ly/235Qb3e

  • This chapter from the book “Cyberspace and International Relations” suggests that advances in big data give humanitarian organizations unprecedented opportunities to prevent and mitigate natural disasters and humanitarian crises. However, the sheer amount of unstructured data necessitates effective “data mining” strategies for multinational organizations to make the most use of this data.
  • By profiling some civil-society organizations who use big data in their peacekeeping efforts, Karlsrud suggests that these community-focused initiatives are leading the movement toward analyzing and using big data in countries vulnerable to crisis.
  • The chapter concludes by offering ten recommendations to UN peacekeeping forces to best realize the potential of big data and new technology in supporting their operations.

Mancini, Francesco. “New Technology and the Prevention of Violence and Conflict.” International Peace Institute, 2013. http://bit.ly/1ltLfNV

  • This report from the International Peace Institute looks at five case studies to assess how information and communications technologies (ICTs) can help prevent humanitarian conflicts and violence. Their findings suggest that context has a significant impact on the ability of these ICTs to prevent conflict, and any strategies must take into account the specific contingencies of the region to be successful.
  • The report draws seven lessons from the five case studies, among them:
    • New technologies are just one in a variety of tools to combat violence. Consequently, organizations must investigate a variety of complementary strategies to prevent conflicts, and not simply rely on ICTs.
    • Not every community or social group will have the same relationship to technology, and their ability to adopt new technologies are similarly influenced by their context. Therefore, a detailed needs assessment must take place before any new technologies are implemented.
    • New technologies may be co-opted by violent groups seeking to maintain conflict in the region. Consequently, humanitarian groups must be sensitive to existing political actors and be aware of possible negative consequences these new technologies may spark.
    • Local input is integral to support conflict prevention measures, and there exists need for collaboration and awareness-raising with communities to ensure new technologies are sustainable and effective.
    • Information shared between civil-society groups has more potential to develop early-warning systems. This horizontal distribution of information can also allow communities to maintain the accountability of local leaders.

Meier, Patrick. “Digital Humanitarians: How Big Data Is Changing the Face of Humanitarian Response.” CRC Press, 2015. http://amzn.to/1RQ4ozc

  • This book traces the emergence of “Digital Humanitarians”—people who harness new digital tools and technologies to support humanitarian action. Meier suggests that this has created a “nervous system” to connect people from disparate parts of the world, revolutionizing the way we respond to humanitarian crises.
  • Meier argues that such technology is reconfiguring the structure of the humanitarian space, where victims are not simply passive recipients of aid but can contribute with other global citizens. This in turn makes us more humane and engaged people.

Robertson, Andrew and Olson, Steve. “Using Data Sharing to Improve Coordination in Peacebuilding.” United States Institute of Peace, 2012. http://bit.ly/235QuLm

  • This report functions as an overview of a roundtable workshop on Technology, Science and Peace Building held at the United States Institute of Peace. The workshop aimed to investigate how data-sharing techniques can be developed for use in peace building or conflict management.
  • Four main themes emerged from discussions during the workshop:
    • “Data sharing requires working across a technology-culture divide”—Data sharing needs the foundation of a strong relationship, which can depend on sociocultural, rather than technological, factors.
    • “Information sharing requires building and maintaining trust”—These relationships are often built on trust, which can include both technological and social perspectives.
    • “Information sharing requires linking civilian-military policy discussions to technology”—Even when sophisticated data-sharing technologies exist, continuous engagement between different stakeholders is necessary. Therefore, procedures used to maintain civil-military engagement should be broadened to include technology.
    • “Collaboration software needs to be aligned with user needs”—technology providers need to keep in mind the needs of its users, in this case peacebuilders, in order to ensure sustainability.

United Nations Independent Expert Advisory Group on a Data Revolution for Sustainable Development. “A World That Counts, Mobilizing the Data Revolution.” 2014. https://bit.ly/2Cb3lXq

  • This report focuses on the potential benefits and risks data holds for sustainable development. Included in this is a strategic framework for using and managing data for humanitarian purposes. It describes a need for a multinational consensus to be developed to ensure data is shared effectively and efficiently.
  • It suggests that “people who are counted”—i.e., those who are included in data collection processes—have better development outcomes and a better chance for humanitarian response in emergency or conflict situations.

Whipkey, Katie and Verity, Andrej. “Guidance for Incorporating Big Data into Humanitarian Operations.” Digital Humanitarian Network, 2015. http://bit.ly/1Y2BMkQ

  • This report produced by the Digital Humanitarian Network provides an overview of big data, and how humanitarian organizations can integrate this technology into their humanitarian response. It primarily functions as a guide for organizations, and provides concise, brief outlines of what big data is, and how it can benefit humanitarian groups.
  • The report puts forward four main benefits acquired through the use of big data by humanitarian organizations: 1) the ability to leverage real-time information; 2) the ability to make more informed decisions; 3) the ability to learn new insights; 4) the ability for organizations to be more prepared.
  • It goes on to assess six challenges big data poses for humanitarian organizations: 1) geography, and the unequal access to technology across regions; 2) the potential for user error when processing data; 3) limited technology; 4) questionable validity of data; 5) underdeveloped policies and ethics relating to data management; and 6) limitations relating to staff knowledge.

Risks of Using Big Data in Humanitarian Context

Crawford, Kate, and Megan Finn. “The limits of crisis data: analytical and ethical challenges of using social and mobile data to understand disasters.” GeoJournal 80.4, 2015. http://bit.ly/1X0F7AI

  • Crawford & Finn present a critical analysis of the use of big data in disaster management, taking a more skeptical tone to the data revolution facing humanitarian response.
  • They argue that though social and mobile data analysis can yield important insights and tools in crisis events, it also presents a number of limitations which can lead to oversights being made by researchers or humanitarian response teams.
  • Crawford & Finn explore the ethical concerns the use of big data in disaster events introduces, including issues of power, privacy, and consent.
  • The paper concludes by recommending that critical data studies, such as those presented in the paper, be integrated into crisis event research in order to analyze some of the assumptions which underlie mobile and social data.

Jacobsen, Katja Lindskov. “Making design safe for citizens: A hidden history of humanitarian experimentation.” Citizenship Studies 14.1: 89-103, 2010. http://bit.ly/1YaRTwG

  • This paper explores the phenomenon of “humanitarian experimentation,” where victims of disaster or conflict are the subjects of experiments to test the application of technologies before they are administered in greater civilian populations.
  • By analyzing the particular use of iris recognition technology during the repatriation of Afghan refugees from Pakistan between 2002 and 2007, Jacobsen suggests that this “humanitarian experimentation” compromises the security of already vulnerable refugees in order to better deliver biometric products to the rest of the world.

Responsible Data Forum. “Responsible Data Reflection Stories: An Overview.” http://bit.ly/1Rszrz1

  • This piece from the Responsible Data forum is primarily a compilation of “war stories” which follow some of the challenges in using big data for social good. By drawing on these crowdsourced cases, the Forum also presents an overview which makes key recommendations to overcome some of the challenges associated with big data in humanitarian organizations.
  • It finds that most of these challenges occur when organizations are ill-equipped to manage data and new technologies, or are unaware about how different groups interact in digital spaces in different ways.

Sandvik, Kristin Bergtora. “The humanitarian cyberspace: shrinking space or an expanding frontier?” Third World Quarterly 37:1, 17-32, 2016. http://bit.ly/1PIiACK

  • This paper analyzes the shift toward technology-driven humanitarian work, which increasingly takes place online in cyberspace, reshaping the definition and application of aid. This has occurred alongside what many suggest is a shrinking of the humanitarian space.
  • Sandvik provides three interpretations of this phenomenon:
    • First, traditional threats remain in the humanitarian space, which are both modified and reinforced by technology.
    • Second, new threats are introduced by the increasing use of technology in humanitarianism, and consequently the humanitarian space may be broadening, not shrinking.
    • Finally, if the shrinking humanitarian space theory holds, cyberspace offers one example of this, where the increasing use of digital technology to manage disasters leads to a contraction of space through the proliferation of remote services.

Additional Readings on Data and Humanitarian Response

* Thanks to: Kristen B. Sandvik; Zara Rahman; Jennifer Schulte; Sean McDonald; Paul Currion; Dinorah Cantú-Pedraza and the Responsible Data Listserve for valuable input.

The Bottom of the Data Pyramid: Big Data and the Global South


Payal Arora at the International Journal of Communication: “To date, little attention has been given to the impact of big data in the Global South, about 60% of whose residents are below the poverty line. Big data manifests in novel and unprecedented ways in these neglected contexts. For instance, India has created biometric national identities for her 1.2 billion people, linking them to welfare schemes, while social entrepreneurial initiatives like the Ushahidi project have leveraged crowdsourcing to provide real-time crisis maps for humanitarian relief.

While these projects are indeed inspirational, this article argues that in the context of the Global South there is a bias in the framing of big data as an instrument of empowerment. Here, the poor, or the “bottom of the pyramid” populace are the new consumer base, agents of social change instead of passive beneficiaries. This neoliberal outlook of big data facilitating inclusive capitalism for the common good sidelines critical perspectives urgently needed if we are to channel big data as a positive social force in emerging economies. This article proposes to assess these new technological developments through the lens of databased democracies, databased identities, and databased geographies to make evident normative assumptions and perspectives in this under-examined context….(More)”.

How to train Public Entrepreneurs


10 Lessons: “…The GovLab and its network of 25 world-class coaches and over 100 mentors helped 446 participants in more than a dozen US cities and thirty foreign countries to take a public interest technology project from idea to implementation. In the process, we’ve learned a lot about the need for new ways of training the next generation of leaders and problem solvers.

Our aim has been to aid public entrepreneurs — passionate and innovative people who wish to take advantage of new technology to do good in the world. That’s why we measure success, not by the number of participants in a class, but by the projects participants create and the impact those projects have on communities….

Lesson 1: There is growing, and unmet, demand for training a new kind of public servant: the public entrepreneur…

Lesson 2: Tap the distributed supply of talent and expertise to accelerate learning…

Lesson 3:  Create new methods for training public entrepreneurs to solve problems…

Lesson 4:  Develop tools to help public interest innovators “cross the chasm” from idea to implementation…

Lesson 5:  Teach collaboration and partnering for change…

Lesson 6:  In order to be successful, public entrepreneurs must be able to define the problem — a skill widely lacking…

Lesson 7:  Connecting innovators and alumni with one another generates a lasting public infrastructure that can help solve problems more effectively…

Lesson 8:  Pedagogical priorities include making problem solving more data-driven and evidence-based….

Lesson 9:  The demand and supply are global — which requires a global mindset and platform in order to learn what has worked elsewhere and why…

Lesson 10:  Collaboration and coordination among anchor organizations is key to meeting the demand and coordinating the supply….(More)

Innovating for pro-poor services: Why politics matter


Nathaniel Mason, Clare Cummings and Julian Doczi for ODI insights: “To solve sustainable development challenges, such as the provision of universal access to basic services, we need new ideas, as well as old ideas applied in new ways and new places. The pace of global innovation, particularly digital innovation, is generating optimism, positioning the world at the start of the ‘Fourth Industrial Revolution’. Innovation can make basic services cheaper, more accessible, more relevant and more desirable for poor people. However, we also know few innovations lead to sustainable, systemic change. The barriers to this are often political – including problems related to motivation, power and collective action. Yet, just as political factors can prevent innovations from being widely adopted, politically smart approaches can help in navigating and mitigating these challenges. And, because innovations can alter the balance of power in societies and markets, they can both provoke new and challenging politics themselves and help unlock systemic political change. When and why does politics affect innovation? What does this mean for donors, foundations and impact investors backing innovations for development?…(More)

UN statistics commission agrees starting point for SDG oversight


Emma Rumney at Public Finance: “The United Nations Statistical Commission agreed on a set of 230 preliminary indicators to measure progress towards the 17 Sustainable Development Goals published last September.

Wu Hongbo, under-secretary-general of the UN Department of Economic and Social Affairs, of which the commission is part, said “completing the indicator framework is not the end of the story – on the contrary, it is the beginning”.

Hongbo said it was necessary to acknowledge that developing a high-quality set of indicators is a technical and necessarily continuous process, “with refinements and improvements” made as “knowledge improves and new data sources become available”.

One challenge will be the effective disaggregation of data by income, sex, age, race, ethnicity, migratory status, disability, geographic location and more, to allow coverage of specific sectors of the population.

This will be essential if the SDGs are to be implemented successfully.
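To make the disaggregation point concrete, here is a toy sketch — hypothetical survey records, assuming pandas — of splitting a single national indicator by sex and location:

```python
# A toy sketch of indicator disaggregation; the records and column
# names are hypothetical, and pandas is assumed to be installed.
import pandas as pd

records = pd.DataFrame({
    "sex":       ["f", "f", "m", "m", "f", "m"],
    "location":  ["urban", "rural", "urban", "rural", "rural", "urban"],
    "in_school": [1, 0, 1, 1, 0, 1],  # e.g., a school-attendance indicator
})

# A single national figure hides subgroup gaps...
print(records["in_school"].mean())

# ...which disaggregation by sex and geographic location makes visible.
print(records.groupby(["sex", "location"])["in_school"].mean())
```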

Hongbo said this will require “an unprecedented amount of data to be produced and analysed”, posing a significant challenge to national statistics systems in both the developing and developed world.

National and regional authorities will also have to develop their own indicators for regional, national and sub-national monitoring, as the global indicators won’t be able to account for different realities, capacities and levels of development.

The statistical commission will now submit its initial global indicator framework to the UN’s Economic and Social Council and General Assembly for adoption….(More)

See also:

Crowdsourcing Site Works to Detect Spread of Zika


Suzanne Tracy at Scientific Computing: “Last month, the Flu Near You crowdsourcing tool expanded its data collection to include Zika, chikungunya and dengue symptoms, such as eye pain, yellow skin/eyes and joint/bone pain. Flu Near You is a free and anonymous Web site and mobile application that allows the public to report their health information by completing brief weekly surveys.

Created by epidemiologists at Harvard, Boston Children’s Hospital and The Skoll Global Threats Fund, the novel participatory disease surveillance tool is intended to complement existing surveillance systems by directly engaging the public in public health reporting. As such, it relies on voluntary participation from the general public, asking participants to take a few seconds each week to report whether they or their family members have been healthy or sick.

Using participant-reported symptoms, the site graphs and maps this information to provide local and national views of illness. Thousands of reports are analyzed and mapped to provide public health officials and researchers with real-time, anonymous information that could help prevent the next pandemic.
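The aggregation step behind those maps is simple in outline. A sketch with hypothetical weekly reports, assuming pandas — Flu Near You’s actual pipeline is not described in this excerpt:

```python
# A sketch of the aggregate-by-area step behind participatory illness
# maps; the reports below are hypothetical, and pandas is assumed.
import pandas as pd

reports = pd.DataFrame({
    "zip":      ["02115", "02115", "10001", "10001", "10001"],
    "symptoms": [["fever", "cough"], [], ["joint pain", "rash"], [], ["fever"]],
})

# A respondent counts as "sick" if they reported any symptom this week.
reports["sick"] = reports["symptoms"].str.len() > 0

# Share of respondents reporting illness, by zip code: the raw material
# for local and national views like Flu Near You's.
print(reports.groupby("zip")["sick"].mean())
```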

The survey, which launched in 2011, is conducted year-round for several reasons.

  • First, it is possible for an influenza outbreak to occur outside of the traditional flu season. For instance, the first wave of pandemic H1N1 hit in the spring of 2009. The project wants to capture any emerging outbreak, should something similar occur again.
  • Second, the project’s symptoms-based health forms allow it to monitor other diseases, such as the recently-added Zika, chikungunya and dengue, which may have different seasons than influenza….(More)

See also: http://flunearyou.org and video: Fight the flu. Save lives

Accelerating Discovery with New Tools and Methods for Next Generation Social Science


DARPA: “The explosive growth of global digital connectivity has opened new possibilities for designing and conducting social science research. Once limited by practical constraints to experiments involving just a few dozen participants—often university students or other easily available groups—or to correlational studies of large datasets without any opportunity for determining causation, scientists can now engage thousands of diverse volunteers online and explore an expanded range of important topics and questions. If new tools and methods for harnessing virtual or alternate reality and massively distributed platforms could be developed and objectively validated, many of today’s most important and vexing challenges in social science—such as identifying the primary drivers of social cooperation, instability and resilience—might be made more tractable, with benefits for domains as broad as national security, public health, and economics.

To begin to assess the research opportunities provided by today’s web-connected world and advanced technologies, DARPA today launched its Next Generation Social Science (NGS2) program. The program aims to build and evaluate new methods and tools to advance rigorous, reproducible social science studies at scales necessary to develop and validate causal models of human social behaviors. The program will draw upon and build across a wide array of disciplines—including social sciences like sociology, economics, political science, anthropology, and psychology, as well as information and computer sciences, physics, biology and math.

As an initial focus, NGS2 will challenge researchers to develop and use these new tools and methods to identify causal mechanisms of “collective identity” formation—how a group of individuals becomes a unified whole, and how under certain circumstances that community breaks down into a chaotic mix of disconnected individuals.

“Social science has done a remarkable job of helping us understand ourselves as the highly social creatures we are, but the field has long acknowledged and rued some frustrating research limitations, including technical and logistical limits to experimentally studying large, representative populations and the challenges of replicating key studies to better understand the limits of our knowledge,” said Adam Russell, DARPA program manager. “As a result, it’s been difficult for social scientists to determine what variables matter most in explaining their observations of human social systems and to move from documenting correlation to identifying causation.”
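Russell’s correlation-versus-causation point can be illustrated with a few lines of simulation — a hedged sketch, assuming numpy and an invented scenario, in which the true effect of a “treatment” is zero but self-selection makes the naive comparison look positive:

```python
# A sketch of why randomized assignment matters; numpy is assumed and
# the scenario is invented. A hidden trait drives both opting in and
# the outcome, so the observational comparison is confounded.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sociability = rng.normal(size=n)  # unobserved confounder

# Observational world: sociable people self-select into the "treatment"
# and also cooperate more anyway; the treatment's true effect is zero.
opted_in = rng.random(n) < 1 / (1 + np.exp(-sociability))
cooperation = 0.5 * sociability + rng.normal(size=n)
print("naive gap:", cooperation[opted_in].mean() - cooperation[~opted_in].mean())

# Experimental world: random assignment breaks the link with sociability,
# so the same comparison now correctly estimates (near) zero.
assigned = rng.random(n) < 0.5
print("randomized gap:", cooperation[assigned].mean() - cooperation[~assigned].mean())
```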

On top of those methodological and analytic limitations, Russell said, the field is inherently challenged because of its subject matter: human beings, with all their complex variability and seeming unpredictability. “Physicists have joked about how much more difficult their field would be if atoms or electrons had personalities, but that’s exactly the situation faced by social scientists,” he said.

By developing and applying new methods and models to larger, more diverse, and more representative groups of individuals—such as through web-based global gaming and alternate reality platforms—NGS2 seeks to validate new tools that may empower social science in the same way that sophisticated telescopes and microscopes have helped advance astronomy and biology….(More)”

How tech is forcing firms to be better global citizens


Catherine Lawson at the BBC: “…technology is forcing companies to up their game and interact with communities more directly and effectively….

Platforms such as Kritical Mass have certainly given a fillip to the idea of crowd-supported philanthropy, attracting individuals and corporate sponsors to its projects, whether that’s saving vultures in Kenya or bringing solar power to rural communities in west Africa.

Sponsors can offer funding, volunteers, expertise or marketing. So rather than imposing corporate ideas of “do-gooding” on communities in a patronising manner, firms can simply respond to demand.

HelpfulPeeps has pushed its volunteering platform into more than 40 countries worldwide, connecting people who want to share their time, knowledge and skills with each other for free.

In the UK, online platform Neighbourly connects community projects and charities with companies and people willing to volunteer their resources. For example, Starbucks has pledged 2,500 days of volunteering and has so far backed 70 community projects….

Judging by the strong public appetite for supporting good causes and campaigning against injustice on sites such as Change.org, Avaaz.org, JustGiving and GoFundMe, his assessment appears to be correct.

And LinkedIn says millions of members have signalled on their profiles that they want to serve on a non-profit board or use their skills to volunteer….

Tech companies in particular are offering expertise and skills to good causes as way of making a tangible difference.

For example, in January, Microsoft announced that through its new organisation, Microsoft Philanthropies, it will donate $1bn-worth (£700m) of cloud computing resources to serve non-profits and university researchers over the next three years…

And data analytics specialist Applied Predictive Technologies (APT) has offered its data-crunching skills to help the Capital Area Food Bank charity distribute food more efficiently to hungry people around the Washington DC area.

APT used data to develop a “hunger heat map” to help CAFB target resources and plan for future demand better.

In another project, APT helped The Cara Program – a Chicago-based charity providing training and job placements to people affected by homelessness or poverty – evaluate what made its students more employable….

And Launch, an open platform jointly founded by Nasa, Nike, the US Agency for International Development and the US Department of State, aims to provide support for start-ups and “inspire innovation”.

In the age of internet transparency, it seems corporates no longer have anywhere to hide – a spot of CSR whitewashing is not going to cut it anymore….(More)”.