We should extend EU bank data sharing to all sectors


Carlos Torres Vila in the Financial Times: “Data is now driving the global economy — just look at the list of the world’s most valuable companies. They collect and exploit the information that users generate through billions of online interactions taking place every day. 


But companies are hoarding data too, preventing others, including the users to whom the data relates, from accessing and using it. This is true of traditional groups such as banks, telcos and utilities, as well as the large digital enterprises that rely on “proprietary” data. 
Global and national regulators must address this problem by forcing companies to give users an easy way to share their own data, if they so choose. This is the logical consequence of personal data belonging to users. There is also the potential for enormous socio-economic benefits if we can create consent-based free data flows. 
We need data-sharing across companies in all sectors in a real-time, standardised way — not at a speed and in a format dictated by the companies that stockpile user data. These new rules should apply to all electronic data generated by users, whether provided directly or observed during their online interactions with any provider, across geographic borders and in any sector. This could include everything from geolocation history and electricity consumption to recent web searches, pension information or even most recently played songs. 

This won’t be easy to achieve in practice, but the good news is that we already have a framework that could be the model for a broader solution. The UK’s Open Banking system provides a tantalising glimpse of what may be possible. In Europe, the Second Payment Services Directive (PSD2) allows banking customers to share data about their transactions with multiple providers via secure, structured IT interfaces. We are already seeing this unlock new business models and drive competition in digital financial services. But these rules do not go far enough — they only apply to payments history, and that isn’t enough to push forward a data-driven economic revolution across other sectors of the economy. 
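
To make "secure, structured IT interfaces" concrete, the sketch below shows roughly what a consent-based account-data request looks like in the Open Banking style. It is a minimal illustration: the endpoint, token handling and response fields are assumptions made for this example, not the actual PSD2 or UK Open Banking specification.

```python
import requests

# Hypothetical sketch of an Open Banking-style account-data request.
# The base URL, token value and response schema are illustrative
# assumptions; real PSD2/Open Banking APIs define their own schemas
# and require registered, certified client applications.

API_BASE = "https://api.example-bank.com/open-data/v1"  # hypothetical endpoint
CONSENT_TOKEN = "token-granted-after-user-consent"      # placeholder value

def fetch_transactions(account_id: str) -> list[dict]:
    """Fetch transactions the user has explicitly consented to share."""
    response = requests.get(
        f"{API_BASE}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {CONSENT_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["transactions"]

if __name__ == "__main__":
    for tx in fetch_transactions("acct-123"):
        print(tx["bookingDate"], tx["amount"], tx["description"])
```

The essential point is architectural: the user, not the bank, decides who receives the data, and the interface is standardised rather than dictated by the data holder.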

We need a global framework with common rules across regions and sectors. This has already happened in financial services: after the 2008 financial crisis, the G20 strengthened global banking standards and created the Financial Stability Board. The rules, while not perfect, have delivered uniformity which has strengthened the system. 

We need a similar global push for common rules on the use of data. While it will be difficult to achieve consensus on data, and undoubtedly more difficult still to implement and enforce it, I believe that now is the time to decide what we want. The involvement of the G20 in setting up global standards will be essential to realising the potential that data has to deliver a better world for all of us. There will be complaints about the cost of implementation. I know first hand how expensive it can be to simultaneously open up and protect sensitive core systems. 

The alternative is siloed data that holds back innovation. There will also be justified concerns that easier data sharing could lead to new user risks. Security must be a non-negotiable principle in designing intercompany interfaces and protecting access to sensitive data. But Open Banking shows that these challenges are resolvable. …(More)”.

The Landscape of Open Data Policies


Apograf: “Open Access (OA) publishing has a long history, going back to the early 1990s, and was born with the explicit intention of improving access to scholarly literature. The internet has played a pivotal role in garnering support for free and reusable research publications, as well as stronger and more democratic peer-review systems — ones that are not bogged down by the restrictions of influential publishing platforms….

Looking back, looking forward

Launched in 1991, arXiv.org was a pioneering platform in this regard, a telling example of how researchers could cooperate to publish academic papers for free and in full view of the public. Though it has limitations — papers are curated by moderators and are not peer-reviewed — arXiv is a demonstration of how technology can be used to overcome some of the incentive and distribution problems that scientific research had long been subjected to.

The scientific community has itself assumed the mantle to this end: the Budapest Open Access Initiative (BOAI) and the Berlin Declaration on Open Access, launched in 2002 and 2003 respectively, are considered landmark movements in the push for unrestricted access to scientific research. While mostly symbolic, these efforts highlighted the growing desire to solve the problems plaguing the space through technology.

The BOAI manifesto begins with a statement that encapsulates the movement’s purpose:

“An old tradition and a new technology have converged to make possible an unprecedented public good. The old tradition is the willingness of scientists and scholars to publish the fruits of their research in scholarly journals without payment, for the sake of inquiry and knowledge. The new technology is the internet. The public good they make possible is the world-wide electronic distribution of the peer-reviewed journal literature and completely free and unrestricted access to it by all scientists, scholars, teachers, students, and other curious minds.”

Plan S is a more recent attempt to make publicly funded research available to all. Launched by Science Europe in September 2018, Plan S — short for ‘Shock’ — has energized the research community with its resolution to make access to publicly funded knowledge a right for everyone and to dissolve the profit-driven ecosystem of research publication. Members of the European Union have vowed to achieve this by 2020.

Plan S has been supported by governments outside Europe as well. China has thrown its weight behind it, and the state of California has enacted a law that requires open access to research one year after publishing. It is, of course, not without its challenges: advocacy and ensuring that publishing is not restricted to a few venues are two such obstacles. However, cOAlition S, the organization behind the guidelines, has agreed to make them more flexible.

The emergence of this trend is not without its difficulties, however, and numerous obstacles continue to hinder the dissemination of information in a manner that is truly transparent and public. Chief among these are the many gates that continue to keep research as somewhat exclusive property, besides the fact that the infrastructure and development for such systems are short on funding and staff…(More)”.

Journalism Initiative Crowdsources Feedback on Failed Foreign Aid Projects


Abigail Higgins at SSIR: “It isn’t unusual that a girl raped in northeastern Kenya would be ignored by law enforcement. But for Mary, whose name has been changed to protect her identity, it should have been different—NGOs had established a hotline to report sexual violence just a few years earlier to help girls like her get justice. Even though the hotline was backed by major aid institutions like Mercy Corps and the British government, calls to it regularly went unanswered.

“That was the story that really affected me. It touched me in terms of how aid failures could impact someone,” says Anthony Langat, a Nairobi-based reporter who investigated the hotline as part of a citizen journalism initiative called What Went Wrong? that examines failed foreign aid projects.

Over six months in 2018, What Went Wrong? collected 142 reports of failed aid projects in Kenya, each submitted over the phone or via social media by the very people the project was supposed to benefit. It’s a move intended to help upend the way foreign aid is disbursed and debated. Although aid organizations spend significant time evaluating whether or not aid works, beneficiaries are often excluded from that process.

“There’s a serious power imbalance,” says Peter DiCampo, the photojournalist behind the initiative. “The people receiving foreign aid generally do not have much say. They don’t get to choose which intervention they want, which one would feel most beneficial for them. Our goal is to help these conversations happen … to put power into the hands of the people receiving foreign aid.”

What Went Wrong? documented eight failed projects in an investigative series published by Devex in March. In Kibera, one of Kenya’s largest slums, public restrooms meant to improve sanitation failed to connect to water and sewage infrastructure and were later repurposed as churches. In another story, the World Bank and local thugs struggled for control over the slum’s electrical grid….(More)”

Getting serious about value


Paper by Mariana Mazzucato and Rainer Kattel: “Public value is value that is created collectively for a public purpose. This requires understanding of how public institutions can engage citizens in defining purpose (participatory structures); nurture organisational capabilities and capacity to shape new opportunities (organisational competencies); dynamically assess the value created (dynamic evaluation); and ensure that societal value is distributed equitably (inclusive growth).

Purpose-driven capitalism requires more than just words and gestures of goodwill. It requires purpose to be put at the centre of how companies and governments are run and how they interact with civil society.

Keynes claimed that practitioners who thought they were just getting the ‘job done’ were slaves of defunct economic theory. Purposeful capitalism, if it is to happen on the ground for real, requires a rethinking of value in economic theory and how it has shaped actions.

Today’s dominant economics framework restricts its understanding of value to a theory of exchange; only that which has a price is valuable. ‘Collective’ effort is missed, since only individual decisions matter: even wages are seen as the outcome of an individual’s choice (maximisation of utility) between leisure and work. ‘Social value’ itself is limited to looking at economic ‘welfare’ principles; that is, aggregate outcomes from individual behaviours…(More)”
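
For readers who want the formal version of the claim about wages, the standard textbook consumption–leisure model (a stylised sketch in generic notation, not the authors’ own formulation) runs:

```latex
\max_{c,\;\ell}\; U(c,\ell)
\qquad \text{subject to} \qquad
c = w\,(T - \ell)
```

where c is consumption, ℓ leisure, T the total time endowment, and w the wage. The first-order condition U_ℓ / U_c = w prices leisure at the wage rate, so the wage falls out of one individual’s optimisation; collective effort never enters the model, which is precisely the limitation the authors criticise.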

The European Lead Factory: Collective intelligence and cooperation to improve patients’ lives


Press Release: “While researchers from small and medium-sized companies and academic institutions often have enormous numbers of ideas, they don’t always have enough time or resources to develop them all. As a result, many ideas get left behind because companies and academics typically have to focus on narrow areas of research. This is known as the “Innovation Gap”. ESCulab (European Screening Centre: unique library for attractive biology) aims to turn this problem into an opportunity by creating a comprehensive library of high-quality compounds. This will serve as a basis for testing potential research targets against a wide variety of compounds.

Any researcher from a European academic institution or a small to medium-sized enterprise within the consortium can apply for a screening of their potential drug target. If a submitted target idea is positively assessed by a committee of experts, it will be run through a screening process and the submitting party will receive a dossier of up to 50 potentially relevant substances that can serve as starting points for further drug discovery activities.
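
To illustrate just the final data-handling step, turning a large screening run into a dossier of up to 50 starting points, here is a minimal sketch. The column names and activity threshold are invented for illustration; the actual ESCulab screening is a far richer wet-lab and informatics process.

```python
import pandas as pd

# Hypothetical post-screening triage: from a full screening run, keep the
# most active compounds (up to 50) as starting points for drug discovery.
# The schema and the 50% inhibition threshold are illustrative assumptions.

def select_hits(results: pd.DataFrame, max_hits: int = 50,
                min_inhibition: float = 50.0) -> pd.DataFrame:
    """Return up to `max_hits` compounds, ranked by assay activity."""
    active = results[results["percent_inhibition"] >= min_inhibition]
    return active.sort_values("percent_inhibition",
                              ascending=False).head(max_hits)

# Tiny synthetic example of a screening result table.
results = pd.DataFrame({
    "compound_id": [f"ELF-{i:06d}" for i in range(5)],
    "percent_inhibition": [92.1, 17.4, 63.8, 55.0, 8.9],
})
print(select_hits(results))
```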

ESCulab will build Europe’s largest collaborative drug discovery platform and is equipped with a total budget of €36.5 million: half is provided by the European Union’s Innovative Medicines Initiative (IMI) and half comes from in-kind contributions from companies of the European Federation of Pharmaceutical Industries and Associations (EFPIA) and the Medicines for Malaria Venture. It builds on the existing library of the European Lead Factory, which consists of around 200,000 compounds, as well as around 350,000 compounds from EFPIA companies. The European Lead Factory aims to initiate 185 new drug discovery projects through the ESCulab project by screening drug targets against its library.

… The platform has already provided a major boost for drug discovery in Europe and is a strong example of how crowdsourcing, collective intelligence and the cooperation within the IMI framework can create real value for academia, industry, society and patients….(More)”

A crisis of legitimacy


Blair Sheppard and Ceri-Ann Droog at strategy+business: “For the last 70 years the world has done remarkably well. According to the World Bank, the number of people living in extreme poverty today is less than it was in 1820, even though the world population is seven times as large. This is a truly remarkable achievement, and it goes hand in hand with equally remarkable overall advances in wealth, scientific progress, human longevity, and quality of life.

But the organizations that created these triumphs — the most prominent businesses, governments, and multilateral institutions of the post–World War II era — have failed to keep their implicit promises. As a result, today’s leading organizations face a global crisis of legitimacy. For the first time in decades, their influence, and even their right to exist, are being questioned.

Businesses are also being held accountable in new ways for the welfare, prosperity, and health of the communities around them and of the general public. Our own global firm, PwC, is among these businesses. The accusations facing any individual enterprise may or may not be justified, but the broader attitudes underlying them must be taken seriously.

The causes of this crisis of legitimacy have to do with five basic challenges affecting every part of the world:

  • Asymmetry: Wealth disparity and the erosion of the middle class
  • Disruption: Abrupt technological changes and their destructive effects
  • Age: Demographic pressures as the average life span of human beings increases and the birth rate falls
  • Populism: Growing populism and rejection of the status quo, with associated nationalism and global fracturing
  • Trust: Declining confidence in the prevailing institutions that make our systems work.

(We use the acronym ADAPT to list these challenges because it evokes the inherent change in our time and the need for institutions to respond with new attitudes and behaviors.)

[Exhibit: the five ADAPT challenges. Source: strategy-business.com/ADAPT]

A few other challenges, such as climate change and human rights issues, may occur to you as equally important. They are not included in this list because they are not at the forefront of this particular crisis of legitimacy in the same way. But they are affected by it; if leading businesses and global institutions lose their perceived value, it will be harder to address every other issue affecting the world today.

Ignoring the crisis of legitimacy is not an option — not even for business leaders who feel their primary responsibility is to their shareholders. If we postpone solutions too long, we could go past the point of no return: The cost of solving these problems will be too high. Brexit could be a test case. The costs and difficulties of withdrawal could be echoed in other political breakdowns around the world. And if you don’t believe that widespread economic and political disruption is possible right now, then consider the other revolutions and abrupt, dramatic changes in sovereignty that have occurred in the last 250 years, often with technological shifts and widespread dissatisfaction as key factors….(More)”.

107 Years Later, The Titanic Sinking Helps Train Problem-Solving AI


Kiona N. Smith at Forbes: “What could the 107-year-old tragedy of the Titanic possibly have to do with modern problems like sustainable agriculture, human trafficking, or health insurance premiums? Data turns out to be the common thread. The modern world, for better or worse, increasingly turns to algorithms to look for patterns in the data and make predictions based on those patterns. And the basic methods are the same whether the question they’re trying to answer is “Would this person survive the Titanic sinking?” or “What are the most likely routes for human trafficking?”

An Enduring Problem

Predicting survival at sea based on the Titanic dataset is a standard practice problem for aspiring data scientists and programmers. Here’s the basic challenge: feed your algorithm a portion of the Titanic passenger list, which includes some basic variables describing each passenger and their fate. From that data, the algorithm (if you’ve programmed it well) should be able to draw some conclusions about which variables made a person more likely to live or die on that cold April night in 1912. To test its success, you then give the algorithm the rest of the passenger list (minus the outcomes) and see how well it predicts their fates.
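
For readers who want to see the exercise in practice, here is a minimal sketch in Python using scikit-learn. It assumes the Kaggle-style train.csv passenger list with columns such as Survived, Pclass, Sex, Age and Fare; the feature choices and model are illustrative, not a competitive solution.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# A minimal sketch of the classic Titanic exercise, assuming the
# Kaggle-style passenger list (train.csv) is available locally.
data = pd.read_csv("train.csv")
data["Sex"] = (data["Sex"] == "female").astype(int)     # encode sex as 0/1
data["Age"] = data["Age"].fillna(data["Age"].median())  # fill missing ages

features = data[["Pclass", "Sex", "Age", "Fare"]]
labels = data["Survived"]

# Hold out part of the passenger list, exactly as the challenge describes:
# train on one portion, then predict the fates of the unseen remainder.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```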

Online communities like Kaggle.com have held competitions to see who can develop the algorithm that predicts survival most accurately, and it’s also a common problem presented to university classes. The passenger list is big enough to be useful, but small enough to be manageable for beginners. There’s a simple set of outcomes — life or death — and around a dozen variables to work with, so the problem is simple enough for beginners to tackle but just complex enough to be interesting. And because the Titanic’s story is so famous, even more than a century later, the problem still resonates.

“It’s interesting to see that even in such a simple problem as the Titanic, there are nuggets,” said Sagie Davidovich, Co-Founder & CEO of SparkBeyond, who used the Titanic problem as an early test for SparkBeyond’s AI platform and still uses it as a way to demonstrate the technology to prospective customers….(More)”.

A Taxonomy of Definitions for the Health Data Ecosystem


Announcement: “Healthcare technologies are rapidly evolving, producing new data sources, data types, and data uses, which precipitate more rapid and complex data sharing. Novel technologies—such as artificial intelligence tools and new internet of things (IoT) devices and services—are providing benefits to patients, doctors, and researchers. Data-driven products and services are deepening patients’ and consumers’ engagement and helping to improve health outcomes. Understanding the evolving health data ecosystem presents new challenges for policymakers and industry. There is an increasing need to better understand and document the stakeholders, the emerging data types and their uses.

The Future of Privacy Forum (FPF) and the Information Accountability Foundation (IAF) partnered to form the FPF-IAF Joint Health Initiative in 2018. Today, the Initiative is releasing A Taxonomy of Definitions for the Health Data Ecosystem; the publication is intended to enable a more nuanced, accurate, and common understanding of the current state of the health data ecosystem. The Taxonomy outlines the established and emerging language of the health data ecosystem. The Taxonomy includes definitions of:

  • The stakeholders currently involved in the health data ecosystem and examples of each;
  • The common and emerging data types that are being collected, used, and shared across the health data ecosystem;
  • The purposes for which data types are used in the health data ecosystem; and
  • The types of actions that are now being performed and which we anticipate will be performed on datasets as the ecosystem evolves and expands.
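
To make the structure concrete, one could imagine encoding such a taxonomy along the lines sketched below. The categories mirror the four lists above, but every entry shown is a hypothetical example, not a definition taken from the FPF-IAF Taxonomy itself.

```python
from dataclasses import dataclass, field

# A sketch of one way to represent a health-data taxonomy in code.
# All entries below are invented examples for illustration only.

@dataclass
class TaxonomyEntry:
    term: str
    definition: str
    examples: list[str] = field(default_factory=list)

taxonomy: dict[str, list[TaxonomyEntry]] = {
    "stakeholders": [
        TaxonomyEntry("health care provider",
                      "An entity that delivers care and records clinical data",
                      ["hospital", "clinic"]),
    ],
    "data_types": [
        TaxonomyEntry("wearable-generated data",
                      "Measurements collected by consumer IoT devices",
                      ["heart rate", "step count"]),
    ],
    "purposes": [
        TaxonomyEntry("treatment",
                      "Use of data to inform an individual's care"),
    ],
    "actions": [
        TaxonomyEntry("de-identification",
                      "Removing or obscuring identifiers before sharing"),
    ],
}

for category, entries in taxonomy.items():
    print(category, "->", [entry.term for entry in entries])
```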

This report is an educational resource that will enable a deeper understanding of the current landscape of stakeholders and data types…(More)”.

Can tracking people through phone-call data improve lives?


Amy Maxmen in Nature: “After an earthquake tore through Haiti in 2010, killing more than 100,000 people, aid agencies spread across the country to work out where the survivors had fled. But Linus Bengtsson, a graduate student studying global health at the Karolinska Institute in Stockholm, thought he could answer the question from afar. Many Haitians would be using their mobile phones, he reasoned, and those calls would pass through phone towers, which could allow researchers to approximate people’s locations. Bengtsson persuaded Digicel, the biggest phone company in Haiti, to share data from millions of call records from before and after the quake. Digicel replaced the names and phone numbers of callers with random numbers to protect their privacy.

Bengtsson’s idea worked. The analysis wasn’t completed or verified quickly enough to help people in Haiti at the time, but in 2012, he and his collaborators reported that the population of Haiti’s capital, Port-au-Prince, dipped by almost one-quarter soon after the quake, and slowly rose over the next 11 months. That result aligned with an intensive, on-the-ground survey conducted by the United Nations.
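
The mechanics are conceptually simple. The sketch below assumes a toy call-detail-record layout (phone number, timestamp, tower region) and shows how pseudonymised records can yield daily population estimates per region. It is a toy version of the approach, not Bengtsson’s actual pipeline; real analyses involve far more careful anonymisation, tower-to-area mapping and bias correction.

```python
import hashlib
import pandas as pd

def pseudonymise(phone_number: str, salt: str = "secret-salt") -> str:
    """Replace a phone number with an irreversible random-looking token."""
    return hashlib.sha256((salt + phone_number).encode()).hexdigest()[:12]

# Assumed CDR layout: phone_number, timestamp, tower_region (toy schema).
records = pd.read_csv("cdr.csv")
records["caller"] = records["phone_number"].astype(str).map(pseudonymise)
records["date"] = pd.to_datetime(records["timestamp"]).dt.date

# Estimate daily population per region as the number of distinct
# subscribers whose calls routed through towers in that region.
daily_population = (records.groupby(["date", "tower_region"])["caller"]
                    .nunique()
                    .rename("distinct_subscribers"))
print(daily_population.head())
```

Comparing these per-region counts before and after an event like the quake is what lets researchers infer population displacement from calls alone.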

Humanitarians and researchers were thrilled. Telecommunications companies scrutinize call-detail records to learn about customers’ locations and phone habits and improve their services. Researchers suddenly realized that this sort of information might help them to improve lives. Even basic population statistics are murky in low-income countries where expensive household surveys are infrequent, and where many people don’t have smartphones, credit cards and other technologies that leave behind a digital trail, making remote-tracking methods used in richer countries too patchy to be useful.

Since the earthquake, scientists working under the rubric of ‘data for good’ have analysed calls from tens of millions of phone owners in Pakistan, Bangladesh, Kenya and at least two dozen other low- and middle-income nations. Humanitarian groups say that they’ve used the results to deliver aid. And researchers have combined call records with other information to try to predict how infectious diseases travel, and to pinpoint locations of poverty, social isolation, violence and more (see ‘Phone calls for good’)….(More)”.

The Geopolitics of Information


Paper by Eric Rosenbach and Katherine Mansted: “Information is now the world’s most consequential and contested geopolitical resource. The world’s most profitable businesses have asserted for years that data is the “new oil.” Political campaigns—and foreign intelligence operatives—have shown over the past two American presidential elections that data-driven social media is the key to public opinion. Leading scientists and technologists understand that good datasets, not just algorithms, will give them a competitive edge.

Data-driven innovation is not only disrupting economies and societies; it is reshaping relations between nations. The pursuit of information power—involving states’ ability to use information to influence, decide, create and communicate—is causing states to rewrite their terms of engagement with markets and citizens, and to redefine national interests and strategic priorities. In short, information power is altering the nature and behavior of the fundamental building block of international relations, the state, with potentially seismic consequences.

Authoritarian governments recognize the strategic importance of information and over the past five years have operationalized powerful domestic and international information strategies. They are cauterizing their domestic information environments and shutting off their citizens from global information flows, while weaponizing information to attack and destabilize democracies. In particular, China and Russia believe that strategic competition in the 21st century is characterized by a zero-sum contest for control of data, as well as the technology and talent needed to convert data into useful information.

Democracies remain fundamentally unprepared for strategic competition in the Information Age. For the United States in particular, as the importance of information as a geopolitical resource has waxed, its information dominance has waned. Since the end of the Cold War, America’s supremacy in information technologies seemed unassailable—not least because of its central role in creating the Internet and its overall economic primacy. Democracies have also considered any type of information strategy to be largely unneeded: government involvement in the domestic information environment feels Orwellian, and democracies have believed that their “inherently benign” foreign policy didn’t need extensive influence operations.

However, to compete and thrive in the 21st century, democracies, and the United States in particular, must develop new national security and economic strategies that address the geopolitics of information. In the 20th century, market capitalist democracies geared infrastructure, energy, trade, and even social policy to protect and advance that era’s key source of power—manufacturing. In this century, democracies must better account for information geopolitics across all dimensions of domestic policy and national strategy….(More)”.