The ‘Datasphere’, Data Flows Beyond Control, and the Challenges for Law and Governance


Paper by Jean-Sylvestre Bergé, Stephane Grumbach and Vincenzo Zeno-Zencovich: “The flows of people, goods and capital, which have considerably increased in recent history, are leading to crises (e.g., migrants, tax evasion, food safety) which reveal the failure to control them. Much less visible, and not yet included in economic measurements, data flows have increased exponentially in the last two decades, with the digitisation of social and economic activities. A new space – Datasphere – is emerging, mostly supported by digital platforms which provide essential services reaching half of the world’s population directly. Their control over data flows raises new challenges to governance, and increasingly conflicts with public administration.

In this paper, we consider the need for, and the difficulty of, regulating this emerging space, and the different approaches followed on both sides of the Atlantic. We distinguish three situations. We first consider data at rest, that is, data viewed from the standpoint of the location where they are physically stored. We then consider data in motion, and the issues related to their combination. Finally, we investigate data in action, that is, data as vectors of command for legal or illegal activities over territories, with impacts on the economy and society as well as security, raising new governance challenges.

The notion of ‘Datasphere’ proposes a holistic comprehension of all the ‘information’ existing on earth, originating in both natural and socio-economic systems, which can be captured in digital form, flows through networks, and is stored, processed and transformed by machines. It differs from ‘Cyberspace’, which is mostly concerned with the networks and technical instruments (from software and protocols to cables and data centers), together with the social activities they enable and the extent to which those activities could or should be allowed.

The paper suggests one – out of the many possible – approach to this new world. Clearly it would be impossible to delve in depth into all its facets, which are as many as those of the physical world. Rather, it attempts to show how traditional legal notions could usefully be deployed to bring order to a highly complex environment, avoiding a piecemeal approach that looks only at details….(More)”.

The Magic of “Multisolving”


Elizabeth Sawin at Stanford Social Innovation Review: “In Japan, manufacturing facilities use “green curtains”—living panels of climbing plants—to clean the air, provide vegetables for company cafeterias, and reduce energy use for cooling. A walk-to-school program in the United Kingdom fights a decline in childhood physical activity while reducing traffic congestion and greenhouse gas emissions from transportation. A food-gleaning program staffed by young volunteers and families facing food insecurity in Spain addresses food waste, hunger, and a desire for sustainability.

Each of these is a real-life example of what I call “multisolving”—where people pool expertise, funding, and political will to solve multiple problems with a single investment of time and money. It’s an approach with great relevance in this era of complex, interlinked, social and environmental challenges. But what’s the best formula for implementing projects that tackle many problems at once?

Climate Interactive, which uses systems analysis to help people address climate change, recently completed a year-long study of multisolving for climate and health. We learned there is no one-size-fits-all recipe, but we did identify three operating principles and three practices that showed up again and again in the projects we studied. What’s more, anyone wanting to access the power of cross-sectoral partnership can adopt them….(More)”.

How Charities Are Using Artificial Intelligence to Boost Impact


Nicole Wallace at the Chronicle of Philanthropy: “The chaos and confusion of conflict often separate family members fleeing for safety. The nonprofit Refunite uses advanced technology to help loved ones reconnect, sometimes across continents and after years of separation.

Refugees register with the service by providing basic information — their name, age, birthplace, clan and subclan, and so forth — along with similar facts about the people they’re trying to find. Powerful algorithms search for possible matches among the more than 1.1 million individuals in the Refunite system. The analytics are further refined using the more than 2,000 searches that the refugees themselves do daily.

The goal: find loved ones or those connected to them who might help in the hunt. Since Refunite introduced the first version of the system in 2010, it has helped more than 40,000 people reconnect.

One factor complicating the work: Cultures define family lineage differently. Refunite co-founder Christopher Mikkelsen confronted this problem when he asked a boy in a refugee camp if he knew where his mother was. “He asked me, ‘Well, what mother do you mean?’ ” Mikkelsen remembers. “And I went, ‘Uh-huh, this is going to be challenging.’ ”

Fortunately, artificial intelligence is well suited to learn and recognize different family patterns. But the technology struggles with some simple things like distinguishing the image of a chicken from that of a car. Mikkelsen believes refugees in camps could offset this weakness by tagging photographs — “car” or “not car” — to help train algorithms. Such work could earn them badly needed cash: The group hopes to set up a system that pays refugees for doing such work.

“To an American, earning $4 a day just isn’t viable as a living,” Mikkelsen says. “But to the global poor, getting an access point to earning this is revolutionizing.”
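The article does not describe Refunite's algorithms in detail, but the kind of tolerant matching it requires (transliterated names vary widely, so exact string comparison fails) can be sketched with Python's standard library. The field names, scoring weights and threshold below are illustrative assumptions, not Refunite's actual system:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Similarity ratio between two names, ignoring case and outer whitespace."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def candidate_matches(query, registry, threshold=0.75):
    """Rank registered profiles against a search query.

    Name similarity is combined with small boosts for exact matches on
    birthplace and clan, since spellings of the name itself vary widely.
    """
    scored = []
    for person in registry:
        score = name_similarity(query["name"], person["name"])
        if query.get("birthplace") == person.get("birthplace"):
            score += 0.1  # boost: same birthplace
        if query.get("clan") == person.get("clan"):
            score += 0.1  # boost: same clan
        score = min(score, 1.0)
        if score >= threshold:
            scored.append({**person, "score": round(score, 2)})
    return sorted(scored, key=lambda p: p["score"], reverse=True)

registry = [
    {"name": "Amina Mohamed", "birthplace": "Mogadishu", "clan": "Hawiye"},
    {"name": "Aamina Mohammed", "birthplace": "Mogadishu", "clan": "Hawiye"},
    {"name": "Omar Hassan", "birthplace": "Kismayo", "clan": "Darod"},
]
query = {"name": "Amina Mohammed", "birthplace": "Mogadishu", "clan": "Hawiye"}
for match in candidate_matches(query, registry):
    print(match["name"], match["score"])  # both spelling variants surface; Omar Hassan does not
```

A production system would add phonetic encodings and learned weights, but the shape of the problem, scoring near-matches rather than demanding equality, is the same.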

Another group, Wild Me, a nonprofit founded by scientists and technologists, has created an open-source software platform that combines artificial intelligence and image recognition to identify and track individual animals. Using the system, scientists can better estimate the number of endangered animals and follow them over large expanses without using invasive techniques….

To fight sex trafficking, police officers often go undercover and interact with people trying to buy sex online. Sadly, demand is high, and there are never enough officers.

Enter Seattle Against Slavery. The nonprofit’s tech-savvy volunteers created chatbots designed to disrupt sex trafficking. Using input from trafficking survivors and law-enforcement agencies, the bots can conduct simultaneous conversations with hundreds of people, engaging them in drawn-out exchanges and arranging rendezvous that never materialize. The group hopes to frustrate buyers so much that they give up their hunt for sex online….

A Philadelphia charity is using machine learning to adapt its services to clients’ needs.

Benefits Data Trust helps people enroll for government-assistance programs like food stamps and Medicaid. Since 2005, the group has helped more than 650,000 people access $7 billion in aid.

The nonprofit has data-sharing agreements with jurisdictions to access more than 40 lists of people who likely qualify for government benefits but do not receive them. The charity contacts those who might be eligible and encourages them to call the Benefits Data Trust for help applying….(More)”.

Big Data for the Greater Good


Book edited by Ali Emrouznejad and Vincent Charles: “This book highlights some of the most fascinating current uses, thought-provoking changes, and biggest challenges that Big Data means for our society. The explosive growth of data and advances in Big Data analytics have created a new frontier for innovation, competition, productivity, and well-being in almost every sector of our society, as well as a source of immense economic and societal value. From the derivation of customer feedback-based insights to fraud detection and preserving privacy; better medical treatments; agriculture and food management; and establishing low-voltage networks – many innovations for the greater good can stem from Big Data. Given the insights it provides, this book will be of interest to both researchers in the field of Big Data, and practitioners from various fields who intend to apply Big Data technologies to improve their strategic and operational decision-making processes….(More)”.

City-as-a-Service


Circle Economy: Today during the WeMakeThe.City festival, Circle Economy launched the ‘City-as-a-Service’ publication, which offers a first glimpse into the ‘circular city of the future’. This publication is an initial and practical exploration of how service models will shape the way in which societal needs can be met in a future urban environment and how cities can take a leadership role in a transition towards a circular economy….

Housing, nutrition, mobility, and clothing are primary human needs and directly linked to material extraction. For each of these needs, Circle Economy has examined the potential impacts that service models can have.

By subscribing to a car-sharing service, for example, consumers are able to choose smaller, cheaper and more efficient cars when driving solo. In the Netherlands alone, this would save 2,200 kton of CO2 annually and reduce annual spending on motoring by 10%. For textiles, a service model could potentially help us avoid “bad buys” that are never worn, which would result in a 15% cost saving for consumers and save 23 kton of CO2 in the Netherlands. …

In an increasingly urban world, cities have to play a leading role to drive sustainable transitions and will lead the way on delivering the positive effects of a circular economy – and hence help to close the circularity gap. The circular economy offers a clear roadmap towards realizing the low-carbon, human-centered and prosperous circular city of the future. The ‘City-as-a-Service’ vision is a key next step into this promising future. Ultimately, service models could be a game changer for cities. In fact, city governments can influence this by providing the right boundary conditions and incentives in their policymaking…..(More)”.

4 reasons why Data Collaboratives are key to addressing migration


Stefaan Verhulst and Andrew Young at the Migration Data Portal: “If every era poses its dilemmas, then our current decade will surely be defined by questions over the challenges and opportunities of a surge in migration. The issues in addressing migration safely, humanely, and for the benefit of communities of origin and destination are varied and complex, and today’s public policy practices and tools are not adequate. Increasingly, it is clear, we need not only new solutions but also new, more agile, methods for arriving at solutions.

Data are central to meeting these challenges and to enabling public policy innovation in a variety of ways. Yet, for all of data’s potential to address public challenges, the truth remains that most data generated today are in fact collected by the private sector. These data contain tremendous possible insights and avenues for innovation in how we solve public problems. But because of access restrictions, privacy concerns and often limited data science capacity, their vast potential often goes untapped.

Data Collaboratives offer a way around this limitation.

Data Collaboratives: A new form of Public-Private Partnership for a Data Age

Data Collaboratives are an emerging form of partnership, typically between the private and public sectors, but often also involving civil society groups and the education sector. Now in use across various countries and sectors, from health to agriculture to economic development, they allow for the opening and sharing of information held in the private sector, in the process freeing up data silos to serve public ends.

Although the practice is still fledgling, we have begun to see instances of Data Collaboratives implemented toward solving specific challenges within the broad and complex refugee and migrant space. As the examples we describe below suggest (which we examine in more detail in the Stanford Social Innovation Review), the use of such Collaboratives is geographically dispersed and diffuse; there is an urgent need to pull together a cohesive body of knowledge to more systematically analyze what works, and what doesn’t.

This is something we have started to do at the GovLab. We have analyzed a wide variety of Data Collaborative efforts, across geographies and sectors, with a goal of understanding when and how they are most effective.

The benefits of Data Collaboratives in the migration field

As part of our research, we have identified four main value propositions for the use of Data Collaboratives in addressing different elements of the multi-faceted migration issue. …(More)”.

Citizen-generated evidence for a more sustainable and healthy food system


Research Report by Bill Vorley:  “Evidence generation by and with low-income citizens is particularly important if policy makers are to improve understanding of people’s diets and the food systems they use, in particular the informal economy. The informal food economy is the main route for low-income communities to secure their food, and is an important source of employment, especially for women and youth. The very nature of informality means that the realities of poor people’s lives are often invisible to policymakers. This invisibility is a major factor in exclusion and results in frequent mismatches between policy and local realities. This paper focuses on citizen-generated evidence as a means for defending and improving the food system of the poor. It clearly outlines a range of approaches to citizen-generated evidence including primary data collection and citizen access to and use of existing information….(More)”.

Mapping the economy in real time is almost ‘within our grasp’


Delphine Strauss at the Financial Times: “The goal of mapping economic activity in real time, just as we do for weather or traffic, is “closer than ever to being within our grasp”, according to Andy Haldane, the Bank of England’s chief economist. In recent years, “data has become the new oil . . . and data companies have become the new oil giants”, Mr Haldane told an audience at King’s Business School …

But economics and finance have been “rather reticent about fully embracing this oil-rush”, partly because economists have tended to prefer a deductive approach that puts theory ahead of measurement. This needs to change, he said, because relying too much on either theory or real-world data in isolation can lead to serious mistakes in policymaking — as was seen when the global financial crisis exposed the “empirical fragility” of macroeconomic models.

Parts of the private sector and academia have been far swifter to exploit the vast troves of ever-accumulating data now available — 90 per cent of which has been created in the last two years alone. Massachusetts Institute of Technology’s “Billion Prices Project”, name-checked in Mr Haldane’s speech, now collects enough data from online retailers for its commercial arm to provide daily inflation updates for 22 economies….

The UK’s Office for National Statistics — which has faced heavy criticism over the quality of its data in recent years — is experimenting with “web-scraping” to collect price quotes for food and groceries, for example, and making use of VAT data from small businesses to improve its output-based estimates of gross domestic product. In both cases, the increased sample size and granularity could bring considerable benefits on top of existing surveys, Mr Haldane said.
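The speech does not detail the ONS methodology, but the basic mechanics of scraping price quotes can be sketched with Python's standard library. The page markup below is a made-up fixture for illustration; a real scraper is tailored to each retailer's actual HTML and fetches pages over the network:

```python
import re
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collect (product, price) pairs from listing markup of the
    hypothetical form:
    <li class="product"><span class="name">…</span><span class="price">£1.10</span></li>
    """
    def __init__(self):
        super().__init__()
        self._field = None   # which span we are inside: "name", "price", or None
        self._name = None    # product name awaiting its price
        self.quotes = []     # collected (name, price) tuples

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field == "name":
            self._name = data.strip()
        elif self._field == "price":
            # strip the currency symbol and keep digits and the decimal point
            self.quotes.append((self._name, float(re.sub(r"[^0-9.]", "", data))))
        self._field = None

page = """
<ul>
  <li class="product"><span class="name">Bread</span><span class="price">£1.10</span></li>
  <li class="product"><span class="name">Milk</span><span class="price">£0.95</span></li>
</ul>
"""
scraper = PriceScraper()
scraper.feed(page)
print(scraper.quotes)  # [('Bread', 1.1), ('Milk', 0.95)]
```

Run daily against thousands of product pages, quotes like these are what allow a price index to be rebuilt at far higher frequency and sample size than a monthly survey.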

The BoE itself is trying to make better use of financial data — for example, by using administrative data on owner-occupied mortgages to better understand pricing decisions in the UK housing market. Mr Haldane sees scope to go further with the new data coming on stream on payment, credit and banking flows. …New data sources and techniques could also help policymakers think about human decision-making — which rarely conforms with the rational process assumed in many economic models. Data on music downloads from Spotify, used as an indicator of sentiment, has recently been shown to do at least as well as a standard consumer confidence survey in tracking consumer spending….(More)”.

Prescription drugs that kill: The challenge of identifying deaths in government data


Mike Stucka at Data Driven Journalism: “An editor at The Palm Beach Post printed out hundreds of pages of reports and asked a simple question that turned out to be weirdly complex: How many people were being killed by a prescription drug?

That question relied on a version of a report that was soon discontinued by the U.S. Food and Drug Administration. Instead, the agency built a new website that doesn’t allow exports or the ability to see substantial chunks of the data. So, I went to raw data files that were horribly formatted — and, before the project was over, the FDA had reissued some of those data files and taken most of them offline.

But I didn’t give up hope. Behind the data — known as FAERS, or the FDA Adverse Event Reporting System — are more than a decade of data for suspected drug complications of nearly every kind. With multiple drugs in many reports, and multiple versions of many reports, the list of drugs alone comes to some 35 million records. And it’s a potential gold mine.

How much of a gold mine? For one relatively rare drug, meant only for the worst kind of cancer pain, we found records tying the drug to more than 900 deaths. A salesman had hired a former exotic dancer and a former Playboy model to help sell the drug known as Subsys. He then pushed salesmen to up the dosage, John Pacenti and Holly Baltz found in their package, “Pay To Prescribe? The Fentanyl Scandal.”

FAERS has some serious limitations, but some serious benefits. The data can tell you why a drug was prescribed; it can tell you if a person was hospitalized because of a drug reaction, or killed, or permanently disabled. It can tell you what country the report came from. It’s got the patient age. It’s got the date of reporting. It’s got other drugs involved. Dosage. There’s a ton of useful information.

Now the bad stuff: There may be multiple reports for each actual case, as well as multiple versions of a single “case” ID….(More)”
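The piece stops short of code, but the versioning problem it describes is concrete: FAERS records carry a case identifier plus a version number, and counting every row double-counts follow-up reports of the same event. A minimal deduplication sketch (field names simplified from the raw files) keeps only the latest version of each case:

```python
def latest_cases(records):
    """Keep only the highest-version record for each case ID."""
    latest = {}
    for rec in records:
        key = rec["caseid"]
        if key not in latest or rec["caseversion"] > latest[key]["caseversion"]:
            latest[key] = rec
    return list(latest.values())

# Illustrative records: case 1001 was reported twice, with the follow-up
# upgrading the outcome, so only version 2 should be counted.
reports = [
    {"caseid": "1001", "caseversion": 1, "outcome": "Hospitalized"},
    {"caseid": "1001", "caseversion": 2, "outcome": "Death"},
    {"caseid": "1002", "caseversion": 1, "outcome": "Disability"},
]
for case in latest_cases(reports):
    print(case["caseid"], case["outcome"])
```

Deduplicating this way is the difference between "records tying a drug to deaths" and an inflated count that a manufacturer could easily dismiss.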

Citizenship and democratic production


Article by Mara Balestrini and Valeria Right in Open Democracy: “In recent decades we have seen the concept of innovation change, as both the ecosystem of innovation-producing agents and the ways in which innovation is produced have expanded. The concept of producer-innovation, for example, where companies innovate on the basis of self-generated ideas, has been superseded by user-innovation, where innovation originates from observing consumers’ needs, and then by consumer-innovation, where consumers, empowered by new technologies, are able to create their own products. Innovation-related business models have changed too: we now talk not only about patent-protected innovation, but also about open innovation and even free innovation, where open knowledge sharing plays a key role.

A similar evolution has taken place in the field of the smart city. While the first smart city models prioritized technology left in the hands of experts as a key factor for solving urban problems, more recent initiatives such as Sharing City (Seoul), Co-city (Bologna), or Fab City (Barcelona) focus on citizen participation, open data economics and collaborative-distributed processes as catalysts for innovative solutions to urban challenges. These initiatives could prompt a new wave in the design of more inclusive and sustainable cities by challenging existing power structures, amplifying the range of solutions to urban problems and, possibly, creating value on a larger scale.

In a context of economic austerity and massive urbanization, public administrations are acknowledging the need to seek innovative alternatives to increasing urban demands. Meanwhile, citizens, harnessing the potential of technologies – many of them accessible through open licenses – are putting their creative capacity into practice and contributing to a wave of innovation that could reinvent even the most established sectors.

Contributive production

The virtuous combination of citizen participation and abilities, digital technologies, and open and collaborative strategies is catalyzing innovation in all areas. Citizen innovation encompasses everything from work and housing to food and health. The scope of work, for example, is potentially affected by the new processes of manufacturing and production on an individual scale: citizens can now produce small and large objects (new capacity), thanks to easy access to new technologies such as 3D printers (new element); they can also take advantage of new intellectual property licenses by adapting innovations from others and freely sharing their own (new rule) in response to a wide range of needs.

Along these lines, between 2015 and 2016, the city of Bristol launched a citizen innovation program aimed at solving problems related to the state of rented homes, which produced solutions through citizen participation and the use of sensors and open data. Citizens themselves designed and produced temperature and humidity sensors – using open hardware (Raspberry Pi), 3D printers and laser cutters – to combat problems related to home damp. These sensors, placed in the homes, made it possible to map the scale of the problem, to distinguish condensation from other forms of humidity, and thus to understand whether the problem was due to structural failures of the buildings or to bad habits of the tenants. Through the inclusion of affected citizens, the community felt empowered to contribute ideas towards solutions to its problems, together with the landlords and the City Council.
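The Bristol project's analysis code is not public, but distinguishing condensation damp from structural damp hinges on a standard calculation: air at a given temperature and relative humidity has a dew point, and condensation forms on any surface colder than that. A sketch using the Magnus approximation (a common empirical formula; the thresholds here are illustrative, not the project's):

```python
import math

def dew_point(temp_c: float, rel_humidity: float) -> float:
    """Dew point in degrees C via the Magnus approximation
    (constants a=17.62, b=243.12, valid for ordinary indoor conditions)."""
    a, b = 17.62, 243.12
    gamma = a * temp_c / (b + temp_c) + math.log(rel_humidity / 100.0)
    return b * gamma / (a - gamma)

def condensation_risk(room_temp_c, rel_humidity, surface_temp_c):
    """Condensation forms when a surface is at or below the dew point of the
    surrounding air, pointing to ventilation or heating issues rather than,
    say, rising damp in the building fabric."""
    return surface_temp_c <= dew_point(room_temp_c, rel_humidity)

# A cold wall (12 degrees C) in a warm, humid room (20 degrees C, 70% RH):
print(round(dew_point(20, 70), 1))    # ≈ 14.4
print(condensation_risk(20, 70, 12))  # True: moisture will condense on the wall
```

With cheap temperature and humidity readings logged over time, a comparison like this is enough to separate homes where tenants' ventilation habits drive the damp from homes where the building itself is at fault.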

A similar process is currently being undertaken in Amsterdam, Barcelona and Pristina under the umbrella of the Making Sense Project. In this case, citizens affected by environmental issues are producing their own sensors and urban devices to collect open data about the city and organizing collective action and awareness interventions….

Digital social innovation is disrupting the field of health too. There are different manifestations of these processes. First, platforms such as DataDonors or PatientsLikeMe show increasing citizen participation in biomedical research through the donation of personal health data…. projects such as OpenCare in Milan and mobile applications like Good Sam show how citizens can organize themselves to provide medical services that would otherwise be very costly, or at a scale and granularity that the public sector could hardly afford….

The production processes of these products and services force us to think about their political implications and the role of public institutions, as they question the cities’ existing participation and contribution rules. In times of sociopolitical turbulence and austerity plans such as these, there is a need to design and test new approaches to civic participation, production and management which can strengthen democracy, add value and take into account the aspirations, emotional intelligence and agency of both individuals and communities.

In order for the new wave of citizen production to generate social capital, inclusive innovation and well-being, it is necessary to ensure that all citizens, particularly those from less-represented communities, are empowered to contribute and participate in the design of cities-for-all. It is therefore essential to develop programs to increase citizen access to the new technologies and the acquisition of the knowhow and skills needed to use and transform them….(More)

This piece is an excerpt from an original article published as part of the eBook El ecosistema de la Democracia Abierta.