The frontiers of data interoperability for sustainable development


Report from the Joined-Up Data Standards (JUDS) project: “…explores where progress has been made, what challenges still remain, and how the new Collaborative on SDG Data Interoperability will play a critical role in moving forward the agenda for interoperability policy.”

Driven by global development agendas such as the 2030 Agenda and the Open Data movement, there is an ever-growing need for a more holistic picture of development processes worldwide and for interoperability solutions that can be scaled. This requires the ability to join up data across multiple data sources and standards to create actionable information.
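In practice, "joining up" data usually means mapping between the different code lists that sources use for the same entities. A minimal sketch of that crosswalk idea, with invented codes and figures that are not drawn from the report:

```python
# Two sources describe the same countries under different code standards;
# a crosswalk table is the interoperability layer that joins them.
# All datasets and numbers here are illustrative.

# Source A: health spending keyed by ISO 3166-1 alpha-3 codes
health_spend = {"KEN": 78.0, "UGA": 55.5}

# Source B: aid flows keyed by a donor's internal country codes
aid_flows = {"625": 12.3, "617": 9.1}

# Crosswalk between the two standards
crosswalk = {"625": "KEN", "617": "UGA"}

def join_sources(health, aid, xwalk):
    """Combine both sources into one record per country."""
    joined = {}
    for donor_code, amount in aid.items():
        iso3 = xwalk.get(donor_code)
        if iso3 is None:
            continue  # unmapped codes are the everyday pain of interoperability
        joined[iso3] = {"health_spend": health.get(iso3), "aid": amount}
    return joined

print(join_sources(health_spend, aid_flows, crosswalk))
```

The hard part in real systems is maintaining the crosswalk itself, which is exactly the standards work the JUDS project describes.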

Solutions that create value for front-line decision makers (health centre managers, local school authorities or water and sanitation committees, for example) and for those engaged in government accountability will be crucial to meet the data needs of the SDGs, and to do so in an internationally comparable way. While progress has been made at both national and international levels, moving from principle to practice by embedding interoperability into day-to-day work continues to present challenges.

Based on research and learning generated by the JUDS project team at Development Initiatives and Publish What You Fund, as well as inputs from interviews with key stakeholders, this report aims to provide an overview of the different definitions and components of interoperability and why it is important, and an outline of the current policy landscape.

We offer a set of guiding principles that we consider essential to implementing interoperability, and contextualise the five frontiers of interoperability for sustainable development that we have identified. The report also offers recommendations on what the role of the Collaborative could be in this fast-evolving landscape….(More)”.

Leveraging the disruptive power of artificial intelligence for fairer opportunities


Makada Henry-Nickie at Brookings: “According to President Obama’s Council of Economic Advisers (CEA), approximately 3.1 million jobs will be rendered obsolete or permanently altered as a consequence of artificial intelligence technologies. Artificial intelligence (AI) will, for the foreseeable future, have a significant disruptive impact on jobs. That said, this disruption can create new opportunities if policymakers choose to harness them—including some with the potential to help address long-standing social inequities. Investing in quality training programs that deliver premium skills, such as computational analysis and cognitive thinking, provides a real opportunity to leverage AI’s disruptive power.

AI’s disruption presents a clear challenge: competition for traditional skilled workers from data scientists and code engineers, who can adapt quickly to new contexts. Data analytics has become an indispensable feature of successful companies across all industries….

Investing in high-quality education and training programs is one way that policymakers proactively attempt to address the workforce challenges presented by artificial intelligence. It is essential that we make affirmative, inclusive choices to ensure that marginalized communities participate equitably in these opportunities.

Policymakers should prioritize understanding the demographics of those most likely to lose jobs in the short-run. As opposed to obsessively assembling case studies, we need to proactively identify policy entrepreneurs who can conceive of training policies that equip workers with technical skills of “long-game” relevance. As IBM points out, “[d]ata democratization impacts every career path, so academia must strive to make data literacy an option, if not a requirement, for every student in any field of study.”

Machines are an equal opportunity displacer, blind to color and socioeconomic status. Effective policy responses require collaborative data collection and coordination among key stakeholders—policymakers, employers, and educational institutions—to identify at-risk worker groups and to inform workforce development strategies. Machine substitution is purely an efficiency game in which workers overwhelmingly lose. Nevertheless, we can blunt these effects by identifying critical leverage points….

Policymakers can choose to harness AI’s disruptive power to address workforce challenges and redesign fair access to opportunity simultaneously. We should train our collective energies on identifying practical policies that update our current agrarian-based education model, which unfairly disadvantages children from economically segregated neighborhoods…(More)”

Open Data in Developing Economies: Toward Building an Evidence Base on What Works and How


New book by Stefaan Verhulst and Andrew Young: “Recent years have witnessed considerable speculation about the potential of open data to bring about wide-scale transformation. The bulk of existing evidence about the impact of open data, however, focuses on high-income countries. Much less is known about open data’s role and value in low- and middle-income countries, and more generally about its possible contributions to economic and social development.

Open Data in Developing Economies features in-depth case studies on how open data is having an impact across the developing world – from an agriculture initiative in Colombia to data-driven healthcare projects in Uganda and South Africa to crisis response in Nepal. The analysis built on these case studies aims to create actionable intelligence regarding:

(a) the conditions under which open data is most (and least) effective in development, presented in the form of a Periodic Table of Open Data;

(b) strategies to maximize the positive contributions of open data to development; and

(c) the means for limiting open data’s harms on developing countries.

Endorsements:

“An empirically grounded assessment that helps us move beyond the hype that greater access to information can improve the lives of people and outlines the enabling factors for open data to be leveraged for development.” – Ania Calderon, Executive Director, International Open Data Charter

“This book is compulsory reading for practitioners, researchers and decision-makers exploring how to harness open data for achieving development outcomes. In an intuitive and compelling way, it provides valuable recommendations and critical reflections to anyone working to share the benefits of an increasingly networked and data-driven society.” – Fernando Perini, Coordinator of the Open Data for Development (OD4D) Network, International Development Research Centre, Canada

Download full-text PDF – See also: http://odimpact.org/

Transatlantic Data Privacy


Paul M. Schwartz and Karl-Nikolaus Peifer in Georgetown Law Journal: “International flows of personal information are more significant than ever, but differences in transatlantic data privacy law imperil this data trade. The resulting policy debate has led the EU to set strict limits on transfers of personal data to any non-EU country—including the United States—that lacks sufficient privacy protections. Bridging the transatlantic data divide is therefore a matter of the greatest significance.

In exploring this issue, this Article analyzes the respective legal identities constructed around data privacy in the EU and the United States. It identifies profound differences in the two systems’ images of the individual as bearer of legal interests. The EU has created a privacy culture around “rights talk” that protects its “data subjects.” In the EU, moreover, rights talk forms a critical part of the postwar European project of creating the identity of a European citizen. In the United States, in contrast, the focus is on a “marketplace discourse” about personal information and the safeguarding of “privacy consumers.” In the United States, data privacy law focuses on protecting consumers in a data marketplace.

This Article uses its models of rights talk and marketplace discourse to analyze how the EU and United States protect their respective data subjects and privacy consumers. Although the differences are great, there is still a path forward. A new set of institutions and processes can play a central role in developing mutually acceptable standards of data privacy. The key documents in this regard are the General Data Protection Regulation, an EU-wide standard that becomes binding in 2018, and the Privacy Shield, an EU–U.S. treaty signed in 2016. These legal standards require regular interactions between the EU and United States and create numerous points for harmonization, coordination, and cooperation. The GDPR and Privacy Shield also establish new kinds of governmental networks to resolve conflicts. The future of international data privacy law rests on the development of new understandings of privacy within these innovative structures….(More)”.

Measuring Tomorrow: Accounting for Well-Being, Resilience, and Sustainability in the Twenty-First Century


Book by Éloi Laurent on “How moving beyond GDP will improve well-being and sustainability…Never before in human history have we produced so much data, and this empirical revolution has shaped economic research and policy profoundly. But are we measuring, and thus managing, the right things—those that will help us solve the real social, economic, political, and environmental challenges of the twenty-first century? In Measuring Tomorrow, Éloi Laurent argues that we need to move away from narrowly useful metrics such as gross domestic product and instead use broader ones that aim at well-being, resilience, and sustainability. By doing so, countries will be able to shift their focus away from infinite and unrealistic growth and toward social justice and quality of life for their citizens.

The time has come for these broader metrics to become more than just descriptive, Laurent argues; applied carefully by private and public decision makers, they can foster genuine progress. He begins by taking stock of the booming field of well-being and sustainability indicators, and explains the insights that the best of these can offer. He then shows how these indicators can be used to develop new policies, from the local to the global….(More)”.

Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers


Leslie Harris at the Future of Privacy Forum: “Data has become the currency of the modern economy. A recent study projects the global volume of data to grow from about 0.8 zettabytes (ZB) in 2009 to more than 35 ZB in 2020, most of it generated within the last two years and held by the corporate sector.
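The quoted projection implies a striking compound growth rate. A quick back-of-the-envelope check, using the figures exactly as quoted in the post:

```python
# Implied compound annual growth rate of global data volume,
# from ~0.8 ZB (2009) to ~35 ZB (2020), per the study cited above.
start_zb, end_zb = 0.8, 35.0
years = 2020 - 2009
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.0%}")  # roughly 41% per year
```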

As the cost of data collection and storage becomes cheaper and computing power increases, so does the value of data to the corporate bottom line. Powerful data science techniques, including machine learning and deep learning, make it possible to search, extract and analyze enormous sets of data from many sources in order to uncover novel insights and engage in predictive analysis. Breakthrough computational techniques allow complex analysis of encrypted data, making it possible for researchers to protect individual privacy, while extracting valuable insights.
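One family of techniques behind that last point is computing aggregates without revealing individual records. A toy illustration using additive secret sharing (a classroom sketch of the secure-aggregation idea, not any specific production system mentioned in the report):

```python
import random

# Each data holder splits its private value into random additive shares,
# so an aggregator can compute the total without ever seeing any
# individual value. Toy sketch only -- no network, no adversary model.

MOD = 2**61 - 1  # arithmetic modulo a large prime

def make_shares(value, n_parties):
    """Split `value` into n_parties shares that sum to it modulo MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % MOD
    return shares + [last]

def secure_sum(private_values, n_parties=3):
    # Each aggregator slot receives one share from every data holder;
    # no single slot learns anything about any individual value.
    slots = [0] * n_parties
    for v in private_values:
        for i, s in enumerate(make_shares(v, n_parties)):
            slots[i] = (slots[i] + s) % MOD
    return sum(slots) % MOD  # only the total is revealed

print(secure_sum([12, 7, 30]))  # → 49
```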

At the same time, these newfound data sources hold significant promise for advancing scholarship, supporting evidence-based policymaking and more robust government statistics, and shaping more impactful social interventions. But because most of this data is held by the private sector, it is rarely available for these purposes, posing what many have argued is a serious impediment to scientific progress.

A variety of reasons have been posited for the reluctance of the corporate sector to share data for academic research. Some have suggested that the private sector doesn’t realize the value of their data for broader social and scientific advancement. Others suggest that companies have no “chief mission” or public obligation to share. But most observers describe the challenge as complex and multifaceted. Companies face a variety of commercial, legal, ethical, and reputational risks that serve as disincentives to sharing data for academic research, with privacy – particularly the risk of reidentification – an intractable concern. For companies, striking the right balance between the commercial and societal value of their data, the privacy interests of their customers, and the interests of academics presents a formidable dilemma.

To be sure, there is evidence that some companies are beginning to share for academic research. For example, a number of pharmaceutical companies are now sharing clinical trial data with researchers, and a number of individual companies have taken steps to make data available as well. What is more, companies are also increasingly providing open or shared data for other important “public good” activities, including international development, humanitarian assistance and better public decision-making. Some are contributing to data collaboratives that pool data from different sources to address societal concerns. Yet, it is still not clear whether and to what extent this “new era of data openness” will accelerate data sharing for academic research.

Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from the corporate sector about the challenges they encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment….(More)”.

Crowdsourced Smart Cities


Paper by Robert A Iannucci and Anthony Rowe: “The vision of applying computing and communication technologies to enhance life in our cities is fundamentally appealing. Pervasive sensing and computing can alert us to imminent dangers, particularly with respect to the movement of vehicles and pedestrians in and around crowded streets. Signaling systems can integrate knowledge of city-scale traffic congestion. Self-driving vehicles can borrow from and contribute to a city-scale information collaborative. Achieving this vision will require significant coordination among the creators of sensors, actuators, and application-level software systems. Cities will invest in such smart infrastructure if and only if they are convinced that the value can be realized. Investment by technology providers in creation of the infrastructure depends to a large degree on their belief in a broad and ready market. To accelerate innovation, this stalemate must be broken. Borrowing a page from the evolution of the internet, we put forward the notion that an initially minimalist networking infrastructure that is well suited to smart city concepts can break this cycle and empower co-development of both clever city-sensing devices and valuable city-scale applications, with players large and small being empowered in the process. We call this the crowdsourced smart city concept. We illustrate the concept via an examination of our ongoing project to crowdsource real-time traffic data, arguing that this can rapidly generalize to many more smart city applications. This exploration motivates study of a number of smart city challenges, crowdsourced or otherwise, leading to a paradigm shift we call edgeless computing….(More)”.
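The crowdsourced-traffic example the authors lead with can be pictured as many devices reporting observed speeds for road segments, with a simple aggregator turning them into per-segment congestion estimates. A minimal sketch of that aggregation step (segment IDs, speeds and reference values are invented for illustration, not the authors' actual system):

```python
from collections import defaultdict
from statistics import median

# Assumed free-flow reference speeds per road segment (hypothetical).
FREE_FLOW_KMH = {"seg-12": 50, "seg-44": 30}

def congestion(reports):
    """reports: list of (segment_id, observed_speed_kmh).
    Returns each segment's median observed speed as a fraction of
    its free-flow speed (1.0 = freely flowing, near 0 = jammed)."""
    by_segment = defaultdict(list)
    for seg, speed in reports:
        by_segment[seg].append(speed)
    # Median damps outliers from a few bad sensors or stopped vehicles.
    return {seg: round(median(v) / FREE_FLOW_KMH[seg], 2)
            for seg, v in by_segment.items()}

reports = [("seg-12", 18), ("seg-12", 22), ("seg-12", 20), ("seg-44", 28)]
print(congestion(reports))  # → {'seg-12': 0.4, 'seg-44': 0.93}
```

The interesting systems questions in the paper sit around a loop like this: who runs the aggregator, how devices discover it, and how new applications reuse the same minimalist infrastructure.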

Ethical questions in data journalism and the power of online discussion


David Craig, Stan Ketterer and Mohammad Yousuf at Data Driven Journalism: “One common element uniting data journalism projects, across different stories and locations, is the ethical challenges they present.

As scholars and practitioners of data journalism have pointed out, main issues include flawed data, misrepresentation from a lack of context, and privacy concerns. Contributors have discussed the ethics of data journalism on this site in posts about topics such as the use of pervasive data, transparency about editorial processes in computational journalism, and best practices for doing data journalism ethically.

Our research project looked at similar ethical challenges by examining journalists’ discussion of the controversial handling of publicly accessible gun permit data in two communities in the United States. The cases are not new now, but the issues they raise persist and point to opportunities – both to learn from online discussion of ethical issues and to ask a wide range of ethical questions about data journalism.

The cases

Less than two weeks after the 2012 shooting deaths of 20 children and six staff members at Sandy Hook Elementary School in Newtown, Connecticut, a journalist at The Journal News in White Plains, New York, wrote a story about the possible expansion of publicly accessible gun permit data. The article was accompanied by three online maps with the locations of gun permit holders. The clickable maps of a two-county area in the New York suburbs also included the names and addresses of the gun permit holders. The detailed maps with personal information prompted a public outcry both locally and nationally, mainly involving privacy and safety concerns, and were subsequently taken down.

Although the 2012 case prompted the greatest attention, another New York newspaper reporter’s Freedom of Information request for a gun permit database for three counties sparked an earlier public outcry in 2008. The Glens Falls Post-Star’s editor published an editorial in response. “We here at The Post-Star find ourselves in the unusual position of responding to the concerns of our readers about something that has not even been published in our newspaper or Web site,” the editorial began. The editor said the request “drew great concern from members of gun clubs and people with gun permits in general, a concern we totally understand.”

Both of these cases prompted discussion among journalists, including participants in NICAR-L, the listserv of the National Institute for Computer-Assisted Reporting, whose subscribers include data journalists from major news organizations in the United States and around the world. Our study examined the content of three discussion threads with a total of 119 posts that focused mainly on ethical issues.

Key ethical issues

Several broad ethical issues, and specific themes related to those issues, appeared in the discussion.

1. Freedom versus responsibility and journalistic purpose…

2. Privacy and verification…

3. Consequences…

….(More)”

See also: David Craig, Stan Ketterer and Mohammad Yousuf, “To Post or Not to Post: Online Discussion of Gun Permit Mapping and the Development of Ethical Standards in Data Journalism,” Journalism & Mass Communication Quarterly.

Talent to Spare: The Untapped Potential for Attracting, Developing and Retaining Talent as an Intermediary in the Social Impact Sector


Report by the Global Social Entrepreneurship Network (GSEN) and the BMW Foundation Herbert Quandt: “…Both social entrepreneurs and the organisations that support them depend on finding and retaining top talent. Although the social impact sector is growing – with more and more university courses focusing on creating positive impact and an increasingly competitive job market – the sector might soon experience a flow of talented people leaving, frustrated with an unhealthy work-life balance or an underinvestment in culture and talent development. Awareness, action and advocacy are needed now….

Potential Solutions: …To design and implement an inclusive, overarching talent strategy that attracts talent with competitive non-financial compensation, an appealing employer brand and innovative job interviews; develops talent with a range of learning opportunities, transparent policies and need-based structures; and retains talent by cultivating a caring culture, creating awareness of employee well-being and providing clear exit strategies….

The investment made into the individuals that shape the social impact sector will determine the amount of change the sector creates in the future. Openness about talent challenges, peer-to-peer support around talent management and sharing of resources are necessary measures to contextualise the “popularisation of purpose” trend and build a healthy sector….(More)”.

The UN is using ethereum’s technology to fund food for thousands of refugees


Joon Ian Wong at Quartz: “The United Nations agency in charge of food aid—often billed as the largest aid organization in the world—is betting that ethereum-based blockchain technology could be the key to delivering aid efficiently to refugees while slashing the costs of doing so.

The agency, known as the World Food Programme (WFP), is the rare example of an organization that has delivered tangible results from its blockchain experiments—unlike the big banks that have experimented with the technology for years.

The WFP says it has transferred $1.4 million in food vouchers to 10,500 Syrian refugees in Jordan since May, and it plans to expand. “We need to bring the project from the current capacity to many, many, more,” says Houman Haddad, the WFP executive leading the project. “By that I mean 1 million transactions per day.”
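At its core, a voucher system like this is an append-only ledger with balance checks before each spend. A plain in-memory sketch of that idea (this is an illustration only, not the WFP's actual Building Blocks code or an ethereum contract):

```python
# Minimal voucher ledger: issue vouchers to beneficiaries, let them
# redeem at merchants, and keep an append-only transaction log.
# Names and amounts are hypothetical.

class VoucherLedger:
    def __init__(self):
        self.balances = {}
        self.log = []  # append-only transaction history

    def issue(self, beneficiary, amount):
        self.balances[beneficiary] = self.balances.get(beneficiary, 0) + amount
        self.log.append(("issue", beneficiary, amount))

    def redeem(self, beneficiary, merchant, amount):
        if self.balances.get(beneficiary, 0) < amount:
            raise ValueError("insufficient voucher balance")
        self.balances[beneficiary] -= amount
        self.log.append(("redeem", beneficiary, merchant, amount))

ledger = VoucherLedger()
ledger.issue("household-001", 50)
ledger.redeem("household-001", "shop-A", 20)
print(ledger.balances["household-001"])  # → 30
```

What a blockchain adds over this single-owner version is shared, tamper-evident state across organizations; scaling that to the million transactions per day Haddad mentions is the open engineering question.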

Haddad, in Mexico to speak at the Ethereum Foundation’s annual developer conference, hopes to expand the UN project, called Building Blocks, from providing payment vouchers for one camp to providing vouchers for four camps, covering 100,000 people, by next January. He hopes to attract developers and partners to the UN project from his conference appearance, organized by the foundation, which acts as a steward for the technical development of the ethereum protocol….

The problem of internal bureaucratic warfare, of course, isn’t limited to the UN. Paul Currion, who co-founded Disberse, another blockchain-based aid delivery platform, lauds the speediness of the WFP effort. “It’s fantastic for proving this can work in the field,” he says. But “we’ve found that the hard work is integrating blockchain technology into existing organizational processes—we can’t just hand people a ticket and expect them to get on the high-speed blockchain train; we also need to drive with them to the station,” he says….(More)”.