Mastercard’s Big Data For Good Initiative: Data Philanthropy On The Front Lines


Randy Bean interviews Shamina Singh: Much has been written about big data initiatives and the efforts of market leaders to derive critical business insights faster. Less has been written about initiatives by some of these same firms to apply big data and analytics to a different set of issues, ones not focused solely on revenue growth or bottom-line profitability. While the focus of most writing has been on the use of data for competitive advantage, a small set of companies has been undertaking, with much less fanfare, a range of initiatives designed to ensure that data can be applied not just for corporate good, but also for social good.

One such firm is Mastercard, which describes itself as a technology company in the payments industry that connects buyers and sellers in 210 countries and territories across the globe. In 2013 Mastercard launched the Mastercard Center for Inclusive Growth, an independent subsidiary focused on the application of data to a range of issues for social benefit….

In testimony before the Senate Foreign Relations Committee on May 4, 2017, Mastercard Vice Chairman Walt Macnee, who serves as the Chairman of the Center for Inclusive Growth, addressed issues of private sector engagement. Macnee noted, “The private sector and public sector can each serve as a force for good independently; however when the public and private sectors work together, they unlock the potential to achieve even more.” Macnee further commented, “We will continue to leverage our technology, data, and know-how in an effort to solve many of the world’s most pressing problems. It is the right thing to do, and it is also good for business.”…

Central to the mission of the Mastercard Center is the notion of “data philanthropy”. This term encompasses notions of data collaboration and data sharing and is at the heart of the initiatives the Center is undertaking. The three cornerstones of the Center’s mandate are:

  • Sharing Data Insights – This is achieved through the concept of “data grants”, which entails granting access to proprietary insights in support of social initiatives in a way that fully protects consumer privacy.
  • Data Knowledge – The Mastercard Center undertakes collaborations with not-for-profit and governmental organizations on a range of initiatives. One such effort was a collaboration with the Obama White House’s Data-Driven Justice Initiative, in which data was used to help advance criminal justice reform. Using insights provided by Mastercard, the initiative was able to demonstrate the impact crime has on merchant locations and local job opportunities in Baltimore.
  • Leveraging Expertise – Similarly, the Mastercard Center has collaborated with private organizations such as DataKind, which undertakes data science initiatives for social good.

Just this past month, the Mastercard Center released initial findings from its Data Exploration: Neighborhood Crime and Local Business initiative. This effort focused on ways in which Mastercard’s proprietary insights could be combined with public data on commercial robberies to help understand the potential relationships between criminal activity and business closings. A preliminary analysis showed that a spike in commercial robberies was followed by an increase in bar and nightclub closings. These analyses help community and business leaders understand factors that can affect business success.

Late last year, Ms. Singh issued A Call to Action on Data Philanthropy, in which she challenges her industry peers to look at ways they can make a difference: “I urge colleagues at other companies to review their data assets to see how they may be leveraged for the benefit of society.” She concludes, “the sheer abundance of data available today offers an unprecedented opportunity to transform the world for good.”…(More)

Democratic Resilience for a Populist Age


Helmut K. Anheier at Project Syndicate: “…[While] many democracies are plagued by serious maladies – such as electoral gerrymandering, voter suppression, fraud and corruption, violations of the rule of law, and threats to judicial independence and press freedom – there is little agreement about which solutions should be pursued.

How to make our democracies more resilient, if not altogether immune, to anti-democratic threats is a central question of our time. …

Democratic resilience demands that citizens do more than bemoan deficiencies and passively await constitutional reform. It requires openness to change and innovation. Such changes may occur incrementally, but their aggregate effect can be immense…

Governments and citizens thus have a rich set of options – such as diversity quotas, automatic voter registration, and online referenda – for addressing democratic deficiencies. Moreover, there are measures that can also help citizens mount a defense of democracy against authoritarian assaults.

To that end, organizations can be created to channel protest and dissent into the democratic process, so that certain voices are not driven to the political fringe. And watchdog groups can oversee deliberative assemblies and co-governance efforts – such as participatory budgeting – to give citizens more direct access to decision-making. At the same time, core governance institutions, like central banks and electoral commissions, should be depoliticized, to prevent their capture by populist opportunists.

When properly applied, these measures can encourage consensus building and thwart special interests. Moreover, such policies can boost public trust and give citizens a greater sense of ownership vis-à-vis their government.

Of course, some political innovations that work in one context may cause real damage in another. Referenda, for example, are easily manipulated by demagogues. Assemblies can become gridlocked, and quotas can restrict voters’ choices. Fixing contemporary democracy will inevitably require experimentation and adaptation.

Still, recent research can help us along the way. The Governance Report 2017 has compiled a diverse list of democratic tools that can be applied in different contexts around the globe – by governments, policymakers, civil-society leaders, and citizens.

In his contribution to the report, German sociologist Claus Offe, Professor Emeritus of the Hertie School and Humboldt University, identifies two fundamental priorities for all democracies. The first is to secure all citizens’ basic rights and ability to participate in civic life; the second is to provide a just and open society with opportunities for all citizens. As it happens, these two imperatives are linked: democratic government should be “of,” “by,” and “for” the people….(More)”.

Elsevier Is Becoming a Data Company. Should Universities Be Wary?


Paul Basken at The Chronicle of Higher Education: “As universities have slowly pushed their scientists to embrace open-access journals, publishers will need new profit centers. Elsevier appears well ahead of the pack in creating a network of products that scientists can use to record, share, store, and measure the value to others of the surging amounts of data they produce.

“Maybe all publishers are going, or wish they were” going, in the direction of becoming data companies, said Vincent Larivière, an associate professor of information science at the University of Montreal. “But Elsevier is the only one that is there.”

A Suite of Services

Universities also recognize the future of data. Their scientists are already seeing that widely and efficiently sharing data in fields such as cancer research has enabled accomplishments that have demonstrably saved lives.

In their eagerness to embrace that future, however, universities may not be paying enough attention to what their choices of systems may eventually cost them, warned Roger C. Schonfeld, a program director at Ithaka S+R. With its comprehensive data-services network, Mr. Schonfeld wrote earlier this year, Elsevier appears ready “to lock in scientists to a research workflow no less powerful than the strength of the lock-in libraries have felt to ‘big deal’ bundles.”….

Some open-access advocates say the situation points to an urgent need to create more robust nonprofit alternatives to Elsevier’s product line of data-compiling and sharing tools. But so far financial backing for the developmental work is thin. One of the best known attempts is the Open Science Framework, a web-based data interface built by the Center for Open Science, which has an annual budget of about $6 million, provided largely by foundations and other private donors.

In general, U.S. research universities — a $70 billion scientific enterprise — have not made major contributions to such projects. The Association of American Universities and the Association of Public and Land-grant Universities have, however, formed a team that’s begun studying the future of data sharing. So far, that effort has been focused on more basic steps such as establishing data-storage facilities, linking them together, and simply persuading scientists to take seriously the need to share data.…(More)”

Africa’s open data revolution hampered by challenges


Gilbert Nakweya at SciDevNet: “According to the inaugural Africa Data Revolution Report (ADRR), there is minimal or non-existent collaboration among data communities regarding the Sustainable Development Goals (SDGs) and Africa’s Agenda 2063.
…The report cites issues such as legal and policy frameworks, infrastructure, technology and interactions among key actors as challenges confronting the data ecosystems of the ten African countries studied: Cote d’Ivoire, Ethiopia, Kenya, Madagascar, Nigeria, Rwanda, Senegal, South Africa, Swaziland and Tanzania.

The ADRR was jointly published by the Economic Commission for Africa, United Nations Development Programme (UNDP), World Wide Web Foundation and Open Data for Development Network (OD4D).

“Open data is Africa’s biggest challenge,” says Nnenna Nwakanma, a senior policy manager at the US-headquartered World Wide Web Foundation, noting that an open data revolution is key to Africa achieving the SDGs.

Nwakanma tells SciDev.Net that the data revolution is built on access to information, to the web and to content, citing open data benefits such as governments functioning more efficiently, businesses innovating more and citizens participating in governance and demanding accountability.

Serge Kapto, a policy specialist on data at the UNDP, says that frameworks adopted by the continent, such as the African Charter on Statistics and the Strategy for the Harmonisation of Statistics in Africa, have laid the groundwork for an African data revolution…

Kapto adds that Africa is well positioned to reap the benefits of the data revolution for sustainable development and to leapfrog technology to serve national and regional development priorities.

But, he explains, much work remains to be done to fully take advantage of the opportunity afforded by the data revolution for achieving development plans….(More)”

Chicago police see less violent crime after using predictive code


Jon Fingas at Engadget: “Law enforcement has been trying predictive policing software for a while now, but how well does it work when it’s put to a tough test? Potentially very well, according to Chicago police. The city’s 7th District police report that their use of predictive algorithms helped reduce the number of shootings by 39 percent year-over-year in the first seven months of 2017, with murders dropping by 33 percent. Three other districts didn’t witness as dramatic a change, but they still saw 15 to 29 percent reductions in shootings and a corresponding 9 to 18 percent drop in murders.

It mainly comes down to knowing where and when to deploy officers. One of the tools used in the 7th District, HunchLab, blends crime statistics with socioeconomic data, weather info and business locations to determine where crimes are likely to happen. Other tools (such as the Strategic Subjects List and ShotSpotter) look at gang affiliation, drug arrest history and gunfire detection sensors.
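
The general technique, blending heterogeneous signals into a per-location risk score, can be illustrated with a minimal sketch. Everything below (the features, weights and grid cells) is a hypothetical illustration, not HunchLab’s actual model, which is proprietary:

```python
import math

# Hypothetical per-grid-cell features; HunchLab's real inputs and weights are proprietary.
cells = [
    {"id": "A1", "robberies_90d": 7, "bars_nearby": 3, "rain_mm": 0.0, "median_income_k": 28},
    {"id": "B4", "robberies_90d": 1, "bars_nearby": 0, "rain_mm": 12.5, "median_income_k": 55},
]

# Illustrative weights; a production system would learn these from labeled crime history.
WEIGHTS = {"robberies_90d": 0.45, "bars_nearby": 0.30, "rain_mm": -0.05, "median_income_k": -0.02}
BIAS = -1.5

def risk_score(cell: dict) -> float:
    """Logistic score in (0, 1); higher suggests patrol priority for this cell."""
    z = BIAS + sum(w * cell[name] for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# Rank cells so the highest-risk areas are suggested for patrol first.
for cell in sorted(cells, key=risk_score, reverse=True):
    print(f"{cell['id']}: risk={risk_score(cell):.2f}")
```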

If the performance holds, it’ll suggest that predictive policing can save lives when crime rates are particularly high, as they have been on Chicago’s South Side. However, both the Chicago Police Department and academics are quick to stress that algorithms are just one part of a larger solution. Officers still have to be present, and this doesn’t tackle the underlying issues that cause crime, such as limited access to education and a lack of economic opportunity. Still, any successful reduction in violence is bound to be appreciated….(More)”.

Digital Decisions Tool


Center for Democracy and Technology (CDT): “Two years ago, CDT embarked on a project to explore what we call “digital decisions” – the use of algorithms, machine learning, big data, and automation to make decisions that impact individuals and shape society. Industry and government are applying algorithms and automation to problems big and small, from reminding us to leave for the airport to determining eligibility for social services and even detecting deadly diseases. This new era of digital decision-making has created a new challenge: ensuring that decisions made by computers reflect values like equality, democracy, and justice. We want to ensure that big data and automation are used in ways that create better outcomes for everyone, and not in ways that disadvantage minority groups.

The engineers and product managers who design these systems are the first line of defense against unfair, discriminatory, and harmful outcomes. To help mitigate harm at the design level, we have launched the first public version of our digital decisions tool. We created the tool to help developers understand and mitigate unintended bias and ethical pitfalls as they design automated decision-making systems.

About the digital decisions tool

This interactive tool translates principles for fair and ethical automated decision-making into a series of questions that can be addressed during the process of designing and deploying an algorithm. The questions address developers’ choices, such as what data to use to train an algorithm, what factors or features in the data to consider, and how to test the algorithm. They also ask about the systems and checks in place to assess risk and ensure fairness. These questions should provoke thoughtful consideration of the subjective choices that go into building an automated decision-making system and how those choices could result in disparate outcomes and unintended harms.
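
One of those testing questions, whether outcomes differ across groups, can be made concrete in a few lines. The records and the 80-percent screen below are illustrative assumptions for the sketch, not part of CDT’s tool itself:

```python
from collections import defaultdict

# Hypothetical (group, decision) records from an automated system; 1 = favorable outcome.
decisions = [("group_a", 1), ("group_a", 1), ("group_a", 0),
             ("group_b", 1), ("group_b", 0), ("group_b", 0)]

totals = defaultdict(int)
favorable = defaultdict(int)
for group, outcome in decisions:
    totals[group] += 1
    favorable[group] += outcome

rates = {g: favorable[g] / totals[g] for g in totals}
print("favorable-outcome rate by group:", rates)

# A common screening heuristic: flag any group whose rate falls below
# four-fifths of the best-off group's rate, then investigate why.
best = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"review needed: {group} rate {rate:.2f} is below 80% of {best:.2f}")
```

A screen like this does not prove unfairness on its own; it flags disparities that the tool’s questions would then push a design team to explain or correct.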

The tool is informed by extensive research by CDT and others about how algorithms and machine learning work, how they’re used, the potential risks of using them to make important decisions, and the principles that civil society has developed to ensure that digital decisions are fair, ethical, and respect civil rights. Some of this research is summarized on CDT’s Digital Decisions webpage….(More)”.

Building Digital Government Strategies


Book by Rodrigo Sandoval-Almazan et al.: “This book provides key strategic principles and best practices to guide the design and implementation of digital government strategies. It offers a series of recommendations and findings for thinking about IT applications in government as a platform for information, services and collaboration, and strategies to avoid identified pitfalls. Digital government research suggests that information technologies have the potential to generate immense public value and transform the relationships between governments, citizens, businesses and other stakeholders. However, developing innovative and high-impact solutions for citizens hinges on the development of strategic institutional, organizational and technical capabilities.

Thus far, particular characteristics and problems of public-sector organizations have promoted the development of poorly integrated and difficult-to-maintain applications. For example, governments maintain separate applications for open data, transparency and public services, leading to duplication of effort and a waste of resources. The costs associated with maintaining such sets of poorly integrated systems may limit the resources available for future projects and innovation.

This book provides best practices and recommendations, based on extensive research in both Mexico and the United States, on how governments can develop a digital government strategy for creating public value, how to finance digital innovation in the public sector, how to build successful collaboration networks and foster citizen engagement, and how to correctly implement open government projects and open data. It will be of interest to researchers, practitioners, students, and public sector IT professionals who work in the design and implementation of technology-based projects and programs….(More)”.

Rise of the Government Chatbot


Zack Quaintance at Government Technology: “A robot uprising has begun, except instead of overthrowing mankind so as to usher in a bleak yet efficient age of cold judgment and colder steel, this uprising is one of friendly robots (so far).

Which is all an alarming way to say that many state, county and municipal governments across the country have begun to deploy relatively simple chatbots, aimed at helping users get more out of online public services such as a city’s website, pothole reporting and open data. These chatbots have been installed in recent months in a diverse range of places including Kansas City, Mo.; North Charleston, S.C.; and Los Angeles — and by many indications, there is an accompanying wave of civic tech companies that are offering this tech to the public sector.

They range from simple to complex in scope, and most of the jurisdictions currently using them say they are doing so on somewhat of a trial or experimental basis. That’s certainly the case in Kansas City, where the city now has a Facebook chatbot to help users get more out of its open data portal.

“The idea was never to create a final chatbot that was super intelligent and amazing,” said Eric Roche, Kansas City’s chief data officer. “The idea was let’s put together a good effort, and put it out there and see if people find it interesting. If they use it, get some lessons learned and then figure out — either in our city, or with developers, or with people like me in other cities, other chief data officers and such — and talk about the future of this platform.”

Roche developed Kansas City’s chatbot earlier this year by working after hours with Code for Kansas City, the local Code for America brigade — and he did so because in the four-plus years the city’s open data program has been active, there have been regular concerns that the info available through it was hard to navigate, search and use for average citizens who aren’t data scientists and don’t work for the city (a common issue currently being addressed by many jurisdictions). The idea behind the Facebook chatbot is that Roche can program it with a host of answers to the most prevalent questions, enabling it to both help interested users and save him time for other work….
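
The pattern Roche describes, a bot preloaded with answers to the most prevalent questions, reduces to matching an incoming question against a stored bank. A minimal sketch, with invented questions and answers rather than Kansas City’s actual content, might look like this:

```python
# Hypothetical FAQ bank; a real deployment would load the city's curated answers.
FAQ = {
    "where can i download crime data": "Crime data lives in the open data portal's Crime dataset.",
    "how do i report a pothole": "Report potholes through the city's 311 service.",
    "what datasets are available": "Browse the full catalog on the open data portal.",
}

def answer(question: str) -> str:
    """Return the stored answer whose wording best overlaps the user's question."""
    words = set(question.lower().split())
    best = max(FAQ, key=lambda q: len(words & set(q.split())))
    if len(words & set(best.split())) < 2:  # too little overlap: hand off to a human
        return "Sorry, I don't know that one yet. A staff member will follow up."
    return FAQ[best]

print(answer("Where do I download the crime data?"))
```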

In North Charleston, S.C., the city has adopted a text-based chatbot, which goes beyond common 311-style interfaces by allowing users to report potholes or any other lapses in city services they may notice. It also allows them to ask questions, which it subsequently answers by crawling city websites and replying with relevant links, said Ryan Johnson, the city’s public relations coordinator.

North Charleston has done this by partnering with a local tech startup that has deep roots in the area’s local government. The company is called Citibot …

With Citibot, residents can report a pothole at 2 a.m., or they can get info about street signs or trash pickup sent right to their phones.

There are also more complex chatbot technologies taking hold at both the civic and state levels, in Los Angeles and Mississippi, to be exact.

Mississippi’s chatbot is called Missi, and its capabilities are vast and nuanced. Residents can even use it for help submitting online payments. It’s accessible by clicking a small chat icon on the side of the website.

Back in May, Los Angeles rolled out Chip, or City Hall Internet Personality, on the Los Angeles Business Assistance Virtual Network. The chatbot operates as a 24/7 digital assistant for visitors to the site, helping them navigate it and better understand its services by answering their inquiries. It is capable of presenting info from anywhere on the site, and it can even go so far as helping users fill out forms or set up email alerts….(More)”

Algorithmic Transparency for the Smart City


Paper by Robert Brauneis and Ellen P. Goodman: “Emerging across many disciplines are questions about algorithmic ethics – about the values embedded in artificial intelligence and big data analytics that increasingly replace human decisionmaking. Many are concerned that an algorithmic society is too opaque to be accountable for its behavior. An individual can be denied parole or denied credit, fired or not hired, for reasons she will never know and that cannot be articulated. In the public sector, the opacity of algorithmic decisionmaking is particularly problematic, both because governmental decisions may be especially weighty and because democratically-elected governments bear special duties of accountability. Investigative journalists have recently exposed the dangerous impenetrability of algorithmic processes used in the criminal justice field – dangerous because the predictions they make can be both erroneous and unfair, with none the wiser.

We set out to test the limits of transparency around governmental deployment of big data analytics, focusing our investigation on local and state government use of predictive algorithms. It is here, in local government, that algorithmically-determined decisions can be most directly impactful. And it is here that stretched agencies are most likely to hand over the analytics to private vendors, which may make design and policy choices out of sight of the client agencies, the public, or both. To see just how impenetrable the resulting “black box” algorithms are, we filed 42 open records requests in 23 states seeking essential information about six predictive algorithm programs. We selected the most widely-used and well-reviewed programs, including those developed by for-profit companies, nonprofits, and academic/private sector partnerships. The goal was to see if, using the open records process, we could discover what policy judgments these algorithms embody, and could evaluate their utility and fairness.

To do this work, we identified what meaningful “algorithmic transparency” entails. We found that in almost every case, it wasn’t provided. Over-broad assertions of trade secrecy were a problem. But contrary to conventional wisdom, they were not the biggest obstacle. It will not usually be necessary to release the code used to execute predictive models in order to dramatically increase transparency. We conclude that publicly-deployed algorithms will be sufficiently transparent only if (1) governments generate appropriate records about their objectives for algorithmic processes and subsequent implementation and validation; (2) government contractors reveal to the public agency sufficient information about how they developed the algorithm; and (3) public agencies and courts treat trade secrecy claims as the limited exception to public disclosure that the law requires. Although it would require a multi-stakeholder process to develop best practices for record generation and disclosure, we present what we believe are eight principal types of information that such records should ideally contain….(More)”.

Getting on the map: How to fix the problem with addresses


Joshua Howgego at New Scientist: “KwaNdengezi is a beguiling neighbourhood on the outskirts of Durban. Its ramshackle dwellings are spread over rolling green hills, with dirt roads winding in between. Nothing much to put it on the map. Until last year, that is, when weird signs started sprouting, nailed to doors, stapled to fences or staked in front of houses. Each consisted of three seemingly random words. Cutaway.jazz.wording said one; tokens.painted.enacted read another.

In a neighbourhood where houses have no numbers and the dirt roads no names, these signs are the fastest way for ambulances to locate women going into labour who need ferrying to the nearest hospital. The hope is that signs like this will save lives and be adopted elsewhere. For the residents of KwaNdengezi in South Africa aren’t alone – recent estimates suggest that only 80 or so countries worldwide have an up-to-date addressing system. And even where one exists, it isn’t always working as well as it could.
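
The mechanism behind such signs is simple in principle: divide the world into small grid cells and map each cell’s index deterministically to a word triple. The sketch below is a toy version under stated assumptions (a tiny vocabulary and a coarse grid); real systems of this kind use tens of thousands of words and cells only a few metres across:

```python
# Toy vocabulary; a real system needs tens of thousands of words so that
# three of them can uniquely label billions of grid cells without collisions.
WORDS = ["cutaway", "jazz", "wording", "tokens", "painted", "enacted", "rolling", "signal"]
CELL_DEG = 0.001  # toy cell size in degrees (~111 m north-south), far coarser than production

def three_words(lat: float, lon: float) -> str:
    """Map a coordinate to a deterministic three-word label for its grid cell."""
    row = int((lat + 90) / CELL_DEG)   # cell row, counted from the south pole
    col = int((lon + 180) / CELL_DEG)  # cell column, counted from the antimeridian
    cell = row * round(360 / CELL_DEG) + col  # one integer per cell on the grid
    n = len(WORDS)
    # Express the cell index in base n; each digit selects one word.
    return ".".join(WORDS[(cell // n**i) % n] for i in range(3))

# The same spot always yields the same label, so it can serve as a de facto address.
print(three_words(-29.855, 30.827))  # a point near Durban, South Africa
```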

Poor addresses aren’t simply confusing: they frustrate businesses and can shave millions of dollars off economic output. That’s why there’s a growing feeling that we need to reinvent the address – and those makeshift three-word signs are just the beginning.

In itself, an address is a simple thing: its purpose is to unambiguously identify a point on Earth’s surface. But it also forms a crucial part of the way societies are managed. Governments use lists of addresses to work out how many people they need to serve; without an address to your name, you can’t apply for a passport…(More)”.