Democratizing data in a 5G world


Blog by Dimitrios Dosis at Mastercard: “The next generation of mobile technology has arrived, and it’s more powerful than anything we’ve experienced before. 5G can move data faster, with little delay — in fact, with 5G, you could’ve downloaded a movie in the time you’ve read this far. 5G will also create a vast network of connected machines. The Internet of Things will finally deliver on its promise to fuse all our smart products — vehicles, appliances, personal devices — into a single streamlined ecosystem.

My smartwatch could monitor my blood pressure and schedule a doctor’s appointment, while my car could collect data on how I drive and how much gas I use while behind the wheel. In some cities, petrol trucks already act as roving gas stations, receiving pings when cars are low on gas and refueling them as needed, wherever they are.

This amounts to an incredible proliferation of data. By 2025, every connected person will conduct nearly 5,000 data interactions every day — one every 18 seconds — whether they know it or not. 

Enticing and convenient as these new 5G-powered developments may be, they also raise complex questions about data. Namely, who is privy to our personal information? As your smart refrigerator records the foods you buy, will the refrigerator’s manufacturer be able to see your eating habits? Could it sell that information to a consumer food product company for market research without your knowledge? And where would the information go from there? 

People are already asking critical questions about data privacy. In fact, 72% of them say they are paying attention to how companies collect and use their data, according to a global survey released last year by the Harvard Business Review Analytic Services. The survey, sponsored by Mastercard, also found that while 60% of executives believed consumers think the value they get in exchange for sharing their data is worthwhile, only 44% of consumers actually felt that way.

There are many reasons for this data disconnect, including the lack of transparency that currently exists in data sharing and the tension between an individual’s need for privacy and his or her desire for personalization.

This paradox can be solved by putting data in the hands of the people who create it — giving consumers the ability to manage, control and share their own personal information when they want to, with whom they want to, and in a way that benefits them.

That’s the basis of Mastercard’s core set of principles regarding data responsibility – and in this 5G world, it’s more important than ever. We will be able to gain from these new technologies, but this change must come with trust and user control at its core. The data ecosystem needs to evolve from schemes dominated by third parties, where some data brokers collect inferred, often unreliable and inaccurate data, then share it without the consumer’s knowledge….(More)”.

From Tech Critique to Ways of Living


Alan Jacobs at The New Atlantis: “Neil Postman was right. So what? In the 1950s and 1960s, a series of thinkers, beginning with Jacques Ellul and Marshall McLuhan, began to describe the anatomy of our technological society. Then, starting in the 1970s, a generation emerged who articulated a detailed critique of that society. The critique produced by these figures I refer to in the singular because it shares core features, if not a common vocabulary. What Ivan Illich, Ursula Franklin, Albert Borgmann, and a few others have said about technology is powerful, incisive, and remarkably coherent. I am going to call the argument they share the Standard Critique of Technology, or SCT. The one problem with the SCT is that it has had no success in reversing, or even slowing, the momentum of our society’s move toward what one of their number, Neil Postman, called technopoly.

The basic argument of the SCT goes like this. We live in a technopoly, a society in which powerful technologies come to dominate the people they are supposed to serve, and reshape us in their image. These technologies, therefore, might be called prescriptive (to use Franklin’s term) or manipulatory (to use Illich’s). For example, social networks promise to forge connections — but they also encourage mob rule. Facial-recognition software helps to identify suspects — and to keep tabs on whole populations. Collectively, these technologies constitute the device paradigm (Borgmann), which in turn produces a culture of compliance (Franklin).

The proper response to this situation is not to shun technology itself, for human beings are intrinsically and necessarily users of tools. Rather, it is to find and use technologies that, instead of manipulating us, serve sound human ends and the focal practices (Borgmann) that embody those ends. A table becomes a center for family life; a musical instrument skillfully played enlivens those around it. Those healthier technologies might be referred to as holistic (Franklin) or convivial (Illich), because they fit within the human lifeworld and enhance our relations with one another. Our task, then, is to discern these tendencies or affordances of our technologies and, on both social and personal levels, choose the holistic, convivial ones.

The Standard Critique of Technology as thus described is cogent and correct. I have referred to it many times and applied it to many different situations. For instance, I have used the logic of the SCT to make a case for rejecting the “walled gardens” of the massive social media companies, and for replacing them with a cultivation of the “digital commons” of the open web.

But the number of people who are even open to following this logic is vanishingly small. For all its cogency, the SCT is utterly powerless to slow our technosocial momentum, much less to alter its direction. Since Postman and the rest made that critique, the social order has rushed ever faster toward a complete and uncritical embrace of the prescriptive, manipulatory technologies deceitfully presented to us as Liberation and Empowerment. So what next?…(More)”.

Give more data, awareness and control to individual citizens, and they will help COVID-19 containment


Paper by Mirco Nanni et al: “The rapid dynamics of COVID-19 calls for quick and effective tracking of virus transmission chains and early detection of outbreaks, especially in the “phase 2” of the pandemic, when lockdown and other restriction measures are progressively withdrawn, in order to avoid or minimize contagion resurgence. For this purpose, contact-tracing apps are being proposed for large-scale adoption by many countries. A centralized approach, where data sensed by the app are all sent to a nation-wide server, raises concerns about citizens’ privacy and needlessly strong digital surveillance, thus alerting us to the need to minimize personal data collection and avoid location tracking. We advocate the conceptual advantage of a decentralized approach, where both contact and location data are collected exclusively in individual citizens’ “personal data stores”, to be shared separately and selectively (e.g., with a backend system, but possibly also with other citizens), voluntarily, only when the citizen has tested positive for COVID-19, and with a privacy-preserving level of granularity. This approach better protects the personal sphere of citizens and affords multiple benefits: it allows for detailed information gathering about infected people in a privacy-preserving fashion; in turn, this enables both contact tracing and the early detection of outbreak hotspots at a finer geographic scale. The decentralized approach is also scalable to large populations, in that only the data of positive patients need be handled at a central level. Our recommendation is two-fold. First, to extend existing decentralized architectures with a light touch, in order to manage the collection of location data locally on the device, and allow the user to share spatio-temporal aggregates—if and when they want and for specific aims—with health authorities, for instance. Second, we favour a longer-term pursuit of realizing a Personal Data Store vision, giving users the opportunity to contribute to the collective good to the extent they want, enhancing self-awareness, and cultivating collective efforts for rebuilding society….(More)”.
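
The decentralized architecture the authors describe can be made concrete with a toy example. The sketch below is illustrative only, not the authors’ protocol: a hypothetical PersonalDataStore keeps encounter and location records on the device, and releases nothing except coarse hourly, grid-level aggregates, and only with the owner’s explicit consent (for instance, after a positive test).

```python
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class PersonalDataStore:
    """Illustrative on-device store: nothing leaves it unless the owner opts in."""
    encounters: list = field(default_factory=list)  # pseudonymous contact IDs + timestamps
    locations: list = field(default_factory=list)   # (timestamp, lat, lon) kept locally

    def record_encounter(self, other_id: str, timestamp: int) -> None:
        self.encounters.append((other_id, timestamp))

    def record_location(self, timestamp: int, lat: float, lon: float) -> None:
        self.locations.append((timestamp, lat, lon))

    def share_aggregates(self, consent: bool, cell_size: float = 0.01):
        """Return coarse spatio-temporal aggregates only if the owner consents
        (e.g., after testing positive). Coordinates are snapped to a grid cell
        and counted per hour, so no exact trajectory is disclosed."""
        if not consent:
            return None
        counts = Counter(
            (ts // 3600, round(lat / cell_size) * cell_size, round(lon / cell_size) * cell_size)
            for ts, lat, lon in self.locations
        )
        return dict(counts)


# Example: a backend only ever sees hourly, grid-level counts from consenting users.
store = PersonalDataStore()
store.record_location(1_600_000_000, 43.7712, 11.2486)
store.record_location(1_600_001_800, 43.7715, 11.2490)
print(store.share_aggregates(consent=True))
```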

The problem with prediction


Article by Joseph Fridman: “…At precisely the same moment in which the idea of predictive control has risen to dominance within the corporate sphere, it’s also gained a remarkable following within cognitive science. According to an increasingly influential school of neuroscientists, who orient themselves around the idea of the ‘predictive brain’, the essential activity of our most important organ is to produce a constant stream of predictions: predictions about the noises we’ll hear, the sensations we’ll feel, the objects we’ll perceive, the actions we’ll perform and the consequences that will follow. Taken together, these expectations weave the tapestry of our reality – in other words, our guesses about what we’ll see in the world become the world we see. Almost 400 years ago, with the dictum ‘I think, therefore I am,’ René Descartes claimed that cognition was the foundation of the human condition. Today, prediction has taken its place. As the cognitive scientist Anil Seth put it: ‘I predict (myself) therefore I am.’

Somehow, the logic we find animating our bodies is the same one transforming our body politic. The prediction engine – the conceptual tool used by today’s leading brain scientists to understand the deepest essence of our humanity – is also the one wielded by today’s most powerful corporations and governments. How did this happen and what does it mean?

One explanation for this odd convergence emerges from a wider historical tendency: humans have often understood the nervous system in terms of the flourishing technologies of their era, as the scientist and historian Matthew Cobb explained in The Idea of the Brain (2020). Thomas Hobbes, in his book Leviathan (1651), likened human bodies to ‘automata’, ‘[e]ngines that move themselves by springs and wheeles as doth a watch’. What is the heart, Hobbes asked, if not ‘a Spring; and the Nerves, but so many Strings …?’ Similarly, Descartes described animal spirits moving through the nerves according to the same physical properties that animated the hydraulic machines he witnessed on display in the French royal gardens.

The rise of electronic communications systems accelerated this trend. In the middle of the 19th century, the surgeon and chemist Alfred Smee said the brain was made up of batteries and photovoltaic circuits, allowing the nervous system to conduct ‘electro-telegraphic communication’ with the body. Towards the turn of the 20th century, the neuroscientist Santiago Ramón y Cajal described the positioning of different neural structures ‘somewhat as a telegraph pole supports the conducting wire’. And, during the First World War, the British Royal Institution Christmas lectures featured the anatomist and anthropologist Arthur Keith, who compared brain cells to operators in a telephonic exchange.

The technologies that have come to dominate many of our lives today are not primarily hydraulic or photovoltaic, or even telephonic or electro-telegraphic. They’re not even computational in any simplistic sense. They are predictive, and their infrastructures construct and constrain behaviour in all spheres of life. The old layers remain – electrical wiring innervates homes and workplaces, and water flows into sinks and showers through plumbing hidden from view. But these infrastructures are now governed by predictive technologies, and they don’t just guide the delivery of materials, but of information. Predictive models construct the feeds we scroll; they autocomplete our texts and emails, prompt us to leave for work on time, and pick out the playlists we listen to on the commute that they’ve plotted out for us. Consequential decisions in law enforcement, military and financial contexts are increasingly influenced by automated assessments spat out by proprietary predictive engines….(More)”.
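
To make the “prediction engine” metaphor concrete, here is a toy sketch (my own illustration, not a model drawn from the article or from formal predictive-processing theory): a system holds a running prediction, compares it with each incoming observation, and corrects itself in proportion to the prediction error.

```python
def predictive_update(prediction: float, observation: float, learning_rate: float = 0.2) -> float:
    """One step of a toy 'predict, compare, correct' loop: the new prediction
    moves toward the observation by a fraction of the prediction error."""
    prediction_error = observation - prediction
    return prediction + learning_rate * prediction_error


# A stream of noisy observations; the prediction gradually settles near the signal.
observations = [10.0, 9.5, 10.4, 10.1, 9.8, 10.2]
prediction = 0.0
for obs in observations:
    prediction = predictive_update(prediction, obs)
    print(f"observed {obs:5.2f} -> predicted {prediction:5.2f}")
```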

Spatial information and the legibility of urban form: Big data in urban morphology


Paper by Geoff Boeing: “Urban planning and morphology have relied on analytical cartography and visual communication tools for centuries to illustrate spatial patterns, conceptualize proposed designs, compare alternatives, and engage the public. Classic urban form visualizations – from Giambattista Nolli’s ichnographic maps of Rome to Allan Jacobs’s figure-ground diagrams of city streets – have compressed physical urban complexity into easily comprehensible information artifacts. Today we can enhance these traditional workflows through the Smart Cities paradigm of understanding cities via user-generated content and harvested data in an information management context. New spatial technology platforms and big data offer new lenses to understand, evaluate, monitor, and manage urban form and evolution. This paper builds on the theoretical framework of visual cultures in urban planning and morphology to introduce and situate computational data science processes for exploring urban fabric patterns and spatial order. It demonstrates these workflows with OSMnx and data from OpenStreetMap, a collaborative spatial information system and mapping platform, to examine street network patterns, orientations, and configurations in different study sites around the world, considering what these reveal about the urban fabric. The age of ubiquitous urban data and computational toolkits opens up a new era of worldwide urban form analysis from integrated quantitative and qualitative perspectives….(More)”.
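
For readers who want to try the kind of workflow the paper describes, a minimal sketch with OSMnx and OpenStreetMap data follows. It assumes an OSMnx 1.x-style API; the place name, chosen statistics, and dictionary keys are illustrative rather than the paper’s actual study sites or code, and may vary across library versions.

```python
import osmnx as ox

# Download the drivable street network for one illustrative study site from OpenStreetMap.
G = ox.graph_from_place("Florence, Italy", network_type="drive")

# Basic street-network measures: node/edge counts, street lengths, connectivity, etc.
stats = ox.basic_stats(G)
print(stats["n"], "nodes;", stats["m"], "edges")
print("average street segment length:", round(stats["street_length_avg"], 1), "m")

# Add compass bearings to each street segment to study orientation patterns.
G = ox.add_edge_bearings(G)
bearings = [d["bearing"] for _, _, d in G.edges(data=True) if "bearing" in d]
print("sample bearings (degrees):", [round(b, 1) for b in bearings[:5]])

# Visualize the network's figure-ground pattern.
ox.plot_graph(G, node_size=0, edge_linewidth=0.5)
```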

Governance models for redistribution of data value


Essay by Maria Savona: “The growth of interest in personal data has been unprecedented. Issues of privacy violation, power abuse, practices of electoral behaviour manipulation unveiled in the Cambridge Analytica scandal, and a sense of imminent impingement of our democracies are at the forefront of policy debates. Yet, these concerns seem to overlook the issue of concentration of equity value (stemming from data value, which I use interchangeably here) that underpins the current structure of big tech business models. Whilst these quasi-monopolies own the digital infrastructure, they do not own the personal data that provide the raw material for data analytics. 

The European Commission has been at the forefront of global action to promote convergence of the governance of data (privacy), including, but not limited to, the General Data Protection Regulation (GDPR) (European Commission 2016), enforced in May 2018. Attempts to enforce similar regulations are emerging around the world, including the California Consumer Privacy Act, which came into effect on 1 January 2020. Notwithstanding greater awareness among citizens around the use of their data, companies find that complying with GDPR is, at best, a useless nuisance. 

Data have been seen as ‘innovation investment’ since the beginning of the 1990s. The first edition of the Oslo Manual, the OECD’s international guidelines for collecting and using data on innovation in firms, dates back to 1992 and included the collection of databases on employee best practices as innovation investments. Data are also measured as an ‘intangible asset’ (Corrado et al. 2009 was one of the pioneering studies). What has changed over the last decade? The scale of data generation today is such that its management and control might have already gone well beyond the capacity of the very tech giants we are all feeding. Concerns around data governance and data privacy might be too little and too late. 

In this column, I argue that economists have failed twice: first, to predict the massive concentration of data value in the hands of large platforms; and second, to account for the complexity of the political economy aspects of data accumulation. Based on a pair of recent papers (Savona 2019a, 2019b), I systematise recent research and propose a novel data rights approach to redistribute data value whilst not undermining the range of ethical, legal, and governance challenges that this poses….(More)”.

From satisficing to artificing: The evolution of administrative decision-making in the age of the algorithm


Paper by Thea Snow at Data & Policy: “Algorithmic decision tools (ADTs) are being introduced into public sector organizations to support more accurate and consistent decision-making. Whether they succeed turns, in large part, on how administrators use these tools. This is one of the first empirical studies to explore how ADTs are being used by Street Level Bureaucrats (SLBs). The author develops an original conceptual framework and uses in-depth interviews to explore whether SLBs are ignoring ADTs (algorithm aversion); deferring to ADTs (automation bias); or using ADTs together with their own judgment (an approach the author calls “artificing”). Interviews reveal that artificing is the most common use-type, followed by aversion, while deference is rare. Five conditions appear to influence how practitioners use ADTs: (a) understanding of the tool, (b) perception of human judgment, (c) seeing value in the tool, (d) being offered opportunities to modify the tool, and (e) alignment of the tool with expectations….(More)”.

The Coup We Are Not Talking About


Shoshana Zuboff in the New York Times: “Two decades ago, the American government left democracy’s front door open to California’s fledgling internet companies, a cozy fire lit in welcome. In the years that followed, a surveillance society flourished in those rooms, a social vision born in the distinct but reciprocal needs of public intelligence agencies and private internet companies, both spellbound by a dream of total information awareness. Twenty years later, the fire has jumped the screen, and on Jan. 6, it threatened to burn down democracy’s house.

I have spent exactly 42 years studying the rise of the digital as an economic force driving our transformation into an information civilization. Over the last two decades, I’ve observed the consequences of this surprising political-economic fraternity as those young companies morphed into surveillance empires powered by global architectures of behavioral monitoring, analysis, targeting and prediction that I have called surveillance capitalism. On the strength of their surveillance capabilities and for the sake of their surveillance profits, the new empires engineered a fundamentally anti-democratic epistemic coup marked by unprecedented concentrations of knowledge about us and the unaccountable power that accrues to such knowledge.

In an information civilization, societies are defined by questions of knowledge — how it is distributed, the authority that governs its distribution and the power that protects that authority. Who knows? Who decides who knows? Who decides who decides who knows? Surveillance capitalists now hold the answers to each question, though we never elected them to govern. This is the essence of the epistemic coup. They claim the authority to decide who knows by asserting ownership rights over our personal information and defend that authority with the power to control critical information systems and infrastructures….(More)”.

Digital Age Samaritans


Paper by Zachary D. Kaufman: “Modern technology enables people to view, document, and share evidence of crimes contemporaneously or soon after commission. Electronic transmission of this material — including through social media and mobile devices — raises legal, moral, and practical questions about spectators’ responsibilities. In the digital age, will these actors be bystanders or upstanders? What role can and should the law play in shaping their behavior?

This Article argues that certain witnesses who are not physically present at the scene of a crime should be held criminally accountable for failing to report specified violent offenses. Focusing on rape, police brutality, and other misconduct, this Article demonstrates that recent technological innovations create new opportunities and challenges to pursue justice and accountability. Such culpability centers on “Bad Samaritan laws”: statutes that impose a legal duty to assist others in peril through intervening directly (also known as “the duty to rescue”) or notifying authorities (also known as “the duty to report”). However, many of these antiquated laws arguably apply only to witnesses who are physically present, which limits their potential effectiveness today.

Not all virtual witnesses should be subject to liability. To consider which categories of actors may warrant criminal punishment, this Article introduces a novel typology of bystanders and upstanders in the digital age. This typology draws on an original case study of the first known sexual crime livestreamed in the United States by a third party, which more than 700 people viewed. Harnessing insights from that case study and other episodes, the Article recommends that legislators should modernize, refine, and proliferate Bad Samaritan laws and that law enforcement should enforce these statutes or leverage them to obtain witness testimony. To that end, the Article proposes a model duty-to-report statute that includes features such as applicability to virtual presence and reasoned exemptions for noncompliance….(More)”.

Personal experiences bridge moral and political divides better than facts


Paper by Emily Kubin, Curtis Puryear, Chelsea Schein, and Kurt Gray: “All Americans are affected by rising political polarization, whether because of a gridlocked Congress or antagonistic holiday dinners. People believe that facts are essential for earning the respect of political adversaries, but our research shows that this belief is wrong. We find that sharing personal experiences about a political issue—especially experiences involving harm—helps to foster respect via increased perceptions of rationality. This research provides a straightforward pathway for increasing moral understanding and decreasing political intolerance. These findings also raise questions about how science and society should understand the nature of truth in the era of “fake news.” In moral and political disagreements, everyday people treat subjective experiences as truer than objective facts….(More)”