Information Asymmetries, Blockchain Technologies, and Social Change


Reflections by Stefaan Verhulst on “the potential (and challenges) of Distributed Ledgers for ‘Market for Lemons’ Conditions”: We live in a data age, and it has become common to extol the transformative power of data and information. It is now conventional to assume that many of our most pressing public problems—everything from climate change to terrorism to mass migration—are amenable to a “data fix.”

The truth, though, is a little more complicated. While there is no doubt that data—when analyzed and used responsibly—holds tremendous potential, many factors affect whether, and to what extent, that potential will ultimately be fulfilled.

Our ability to address complex public problems using data depends vitally on how our respective data ecosystems are designed (as well as ongoing questions of representation in, power over, and stewardship of these ecosystems).

Flaws in our data ecosystems not only prevent us from addressing problems; they may also be responsible for many societal failures and inequalities. These flaws result from the fact that:

  • some actors have better access to data than others;
  • data is of poor quality (or even “fake”); contains implicit bias; and/or is not validated and thus not trusted;
  • only easily accessible data are shared and integrated (“open washing”) while important data remain carefully hidden or lack the resources needed for relevant research and analysis; and more generally that
  • even in an era of big and open data, information too often remains stove-piped, siloed, and generally difficult to access.

Several observers have pointed to the relationship between these information asymmetries and, for example, corruption, financial exclusion, global pandemics, forced mass migration, human rights abuses, and electoral fraud.

Consider the transaction costs, power inequities and other obstacles that result from such information asymmetries, namely:

–     At the individual level: too often someone who is trying to open a bank account (or sign up for new cell phone service) is unable to provide all the requisite information, such as credit history, proof of address or other confirmatory and trusted attributes of identity. As such, information asymmetries are in effect limiting this individual’s access to financial and communications services.

–     At the corporate level, a vast body of literature in economics has shown how uncertainty over the quality and trustworthiness of data can impose transaction costs, limit the development of markets for goods and services, or shut them down altogether. This is the well-known “market for lemons” problem made famous in a 1970 paper of the same name by George Akerlof.

–     At the societal or governance level, information asymmetries don’t just affect the efficiency of markets or social inequality. They can also incentivize unwanted behaviors that cause substantial public harm. Tyrants and corrupt politicians thrive on limiting their citizens’ access to information (e.g., information related to bank accounts, investment patterns or disbursement of public funds). Likewise, criminals operate and succeed in the information-scarce corners of the underground economy.

Blockchain technologies and Information Asymmetries

This is where blockchain comes in. At their core, blockchain technologies are a new type of disclosure mechanism that has the potential to address some of the information asymmetries listed above. There are many types of blockchain technologies, and while I use the blanket term ‘blockchain’ below for simplicity’s sake, the nuances between different types of blockchain technologies can greatly impact the character and likelihood of success of a given initiative.

By leveraging a shared and verified database of ledgers stored in a distributed manner, blockchain seeks to redesign information ecosystems in a more transparent, immutable, and trusted manner. Solving information asymmetries may be the real potential of blockchain, and this—much more than the current hype over virtual currencies—is the real reason to assess its potential….(More)”.
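The tamper-evidence behind these claims comes from hash-chaining: each ledger entry commits to the hash of its predecessor, so any retroactive edit breaks every later link. A minimal illustrative sketch in plain Python (not any actual blockchain implementation; the record fields are hypothetical, and real systems add distributed consensus and digital signatures):

```python
import hashlib
import json

def entry_hash(payload: dict) -> str:
    # Hash a canonical (sorted-key) JSON serialization of the payload.
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    # Each new entry commits to the hash of the previous entry.
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    payload = {"record": record, "prev_hash": prev}
    ledger.append({**payload, "hash": entry_hash(payload)})

def verify(ledger: list) -> bool:
    # Recompute every hash and check each link back to its predecessor.
    prev = "0" * 64
    for entry in ledger:
        payload = {"record": entry["record"], "prev_hash": entry["prev_hash"]}
        if entry["prev_hash"] != prev or entry["hash"] != entry_hash(payload):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"asset": "plot-42", "owner": "A"})
append(ledger, {"asset": "plot-42", "owner": "B"})
assert verify(ledger)

# Retroactively altering an early record invalidates the whole chain.
ledger[0]["record"]["owner"] = "C"
assert not verify(ledger)
```

The sketch only shows why an appended history is hard to rewrite silently; it says nothing about who is allowed to append entries, which is where the governance questions discussed below arise.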

The Case for Accountability: How it Enables Effective Data Protection and Trust in the Digital Society


Centre for Information Policy Leadership: “Accountability now has broad international support and has been adopted in many laws, including in the EU General Data Protection Regulation (GDPR), regulatory policies and organisational practices. It is essential that there is consensus and clarity on the precise meaning and application of organisational accountability among all stakeholders, including organisations implementing accountability and data protection authorities (DPAs) overseeing accountability.

Without such consensus, organisations will not know what DPAs expect of them and DPAs will not know how to assess organisations’ accountability-based privacy programs with any degree of consistency and predictability. Thus, drawing from the global experience with accountability to date and from the Centre for Information Policy Leadership’s (CIPL) own extensive prior work on accountability, this paper seeks to explain the following issues:

  • The concept of organisational accountability and how it is reflected in the GDPR;
  • The essential elements of accountability and how the requirements of the GDPR (and of other normative frameworks) map to these elements;
  • Global acceptance and adoption of accountability;
  • How organisations can implement accountability (including by and between controllers and processors) through comprehensive internal privacy programs that implement external rules or the organisation’s own data protection policies and goals, or through verified or certified accountability mechanisms, such as Binding Corporate Rules (BCR), APEC Cross-Border Privacy Rules (CBPR), APEC Privacy Recognition for Processors (PRP), other seals and certifications, including future GDPR certifications and codes of conduct; and
  • The benefits that accountability can deliver to each stakeholder group.

In addition, the paper argues that accountability exists along a spectrum, ranging from basic accountability requirements required by law (such as under the GDPR) to stronger and more granular accountability measures that may not be required by law but that organisations may nevertheless want to implement because they convey substantial benefits….(More)”.

Ethics as Methods: Doing Ethics in the Era of Big Data Research—Introduction


Introduction to the Special Issue of Social Media + Society on “Ethics as Methods: Doing Ethics in the Era of Big Data Research”: Building on a variety of theoretical paradigms (i.e., critical theory, [new] materialism, feminist ethics, theory of cultural techniques) and frameworks (i.e., contextual integrity, deflationary perspective, ethics of care), the Special Issue contributes specific cases and fine-grained conceptual distinctions to ongoing discussions about the ethics of data-driven research.

In the second decade of the 21st century, a grand narrative is emerging that posits knowledge derived from data analytics as true, because of the objective qualities of data, their means of collection and analysis, and the sheer size of the data set. The by-product of this grand narrative is that the qualitative aspects of behavior and experience that form the data are diminished, and the human is removed from the process of analysis.

This situates data science as a process of analysis performed by the tool, which obscures human decisions in the process. The scholars involved in this Special Issue problematize the assumptions and trends in big data research and point out the crisis in accountability that emerges from using such data to make societal interventions.

Our collaborators offer a range of answers to the question of how to configure ethics through a methodological framework in the context of the prevalence of big data, neural networks, and automated, algorithmic governance of much of human socia(bi)lity…(More)”.

Regulation by Blockchain: The Emerging Battle for Supremacy between the Code of Law and Code as Law


Paper by Karen Yeung at Modern Law Review: “Many advocates of distributed ledger technologies (including blockchain) claim that these technologies provide the foundations for an organisational form that will enable individuals to transact with each other free from the travails of conventional law, thus offering the promise of grassroots democratic governance without the need for third party intermediaries. But does the assumption that blockchain systems will operate beyond the reach of conventional law withstand critical scrutiny?

This is the question which this paper investigates, by examining the intersection and interactions between conventional law promulgated and enforced by national legal systems (ie the ‘code of law’) and the internal rules of blockchain systems which take the form of executable software code and cryptographic algorithms via a distributed computing network (‘code as law’).

It identifies three ways in which the code of law may interact with code as law, based primarily on the intended motives and purposes of those engaged in activities in developing, maintaining or undertaking transactions upon the network, referring to the use of blockchain: (a) with the express intention of evading the substantive limits of the law (‘hostile evasion’); (b) to complement and/or supplement conventional law with the aim of streamlining or enhancing compliance with agreed standards (‘efficient alignment’); and (c) to co-ordinate the actions of multiple participants via blockchain to avoid the procedural inefficiencies and complexities associated with the legal process, including the transaction, monitoring and agency costs associated with conventional law (‘alleviating transactional friction’).

These different classes of case are likely to generate different dynamic interactions between the blockchain code and conventional legal systems, which I describe as ‘cat and mouse’, the ‘joys of (patriarchal) marriage’ and ‘uneasy coexistence and mutual suspicion’ respectively…(More)”.

Evaluating Civic Open Data Standards


Renee Sieber and Rachel Bloom at SocArXiv Papers: In many ways, a precondition to realizing the promise of open government data is the standardization of that data. Open data standards ensure interoperability, establish benchmarks for assessing whether governments achieve their goals in publishing open data, and can better ensure the accuracy of the data. Interoperability enables the use of off-the-shelf software and can ease third-party development of products that serve multiple locales.

Our project aims to determine which standards for civic data are “best” to open up government data. We began by disambiguating the multiple meanings of what constitutes a data standard by creating a standards stack.

The empirical research started by identifying twelve “high value” open datasets for which we found 22 data standards. A qualitative systematic review of the gray literature and standards documentation generated 18 evaluation metrics, which we grouped into four categories. We then evaluated the civic data standards against these metrics. Our goal is to identify and characterize types of standards and provide a systematic way to assess their quality…(More)”.

The Data Transfer Project


About: “The Data Transfer Project was formed in 2017 to create an open-source, service-to-service data portability platform so that all individuals across the web could easily move their data between online service providers whenever they want.

The contributors to the Data Transfer Project believe portability and interoperability are central to innovation. Making it easier for individuals to choose among services facilitates competition, empowers individuals to try new services and enables them to choose the offering that best suits their needs.

Current contributors include Facebook, Google, Microsoft and Twitter.

Individuals have many reasons to transfer data, but we want to highlight a few examples that demonstrate the additional value of service-to-service portability.

  • A user discovers a new photo printing service offering beautiful and innovative photo book formats, but their photos are stored in their social media account. With the Data Transfer Project, they could visit a website or app offered by the photo printing service and initiate a transfer directly from their social media platform to the photo book service.
  • A user doesn’t agree with the privacy policy of their music service. They want to stop using it immediately, but don’t want to lose the playlists they have created. Using this open-source software, they could use the export functionality of the original Provider to save a copy of their playlists to the cloud. This enables them to import the lists to a new Provider, or multiple Providers, once they decide on a new service.
  • A large company is getting requests from customers who would like to import data from a legacy Provider that is going out of business. The legacy Provider has limited options for letting customers move their data. The large company writes an Adapter for the legacy Provider’s Application Programming Interfaces (APIs) that permits users to transfer data to their service, also benefiting other Providers that handle the same data type.
  • A user in a low bandwidth area has been working with an architect on drawings and graphics for a new house. At the end of the project, they both want to transfer all the files from a shared storage system to the user’s cloud storage drive. They go to the cloud storage Data Transfer Project User Interface (UI) and move hundreds of large files directly, without straining their bandwidth.
  • An industry association for supermarkets wants to allow customers to transfer their loyalty card data from one member grocer to another, so they can get coupons based on buying habits between stores. The Association would do this by hosting an industry-specific Host Platform of DTP.

The innovation in each of these examples lies behind the scenes: Data Transfer Project makes it easy for Providers to allow their customers to interact with their data in ways their customers would expect. In most cases, the direct-data transfer experience will be branded and managed by the receiving Provider, and the customer wouldn’t need to see DTP branding or infrastructure at all….
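The Adapter model underlying these examples can be sketched in miniature. This is a hedged illustration, not DTP’s actual API (the real project is written in Java with different interfaces; all names and data structures below are hypothetical):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class Playlist:
    # A simplified common data model shared by all adapters.
    name: str
    tracks: list = field(default_factory=list)

class Adapter(ABC):
    """Each Provider supplies an adapter mapping its own API to the common model."""

    @abstractmethod
    def export_playlists(self) -> list:
        ...

    @abstractmethod
    def import_playlists(self, playlists: list) -> None:
        ...

class LegacyMusicService(Adapter):
    # Stands in for a Provider with its own internal storage format.
    def __init__(self, raw_data: dict):
        self._raw = raw_data  # provider-specific: {playlist name: track list}

    def export_playlists(self) -> list:
        return [Playlist(name=n, tracks=t) for n, t in self._raw.items()]

    def import_playlists(self, playlists: list) -> None:
        self._raw = {p.name: list(p.tracks) for p in playlists}

def transfer(source: Adapter, destination: Adapter) -> None:
    # Service-to-service transfer: export from one Provider, import to another.
    destination.import_playlists(source.export_playlists())

old_service = LegacyMusicService({"road trip": ["song-1", "song-2"]})
new_service = LegacyMusicService({})
transfer(old_service, new_service)
```

The design point is that each Provider writes one Adapter against a shared data model, rather than a pairwise integration with every other Provider, which is what makes the approach scale across services.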

To get a more in-depth understanding of the project, its fundamentals and the details involved, please download “Data Transfer Project Overview and Fundamentals”….(More)”.

Blockchain is helping build a new Indian city, but it’s no cure for corruption


Ananya Bhattacharya at Quartz: “Last year, Tharigopula Sambasiva Rao entered into a deal with the state government of Andhra Pradesh. He gave up six acres of his agricultural land in his village, Sakhamuru, in exchange for 7,250 square yards—6,000 square yards of residential plots and 1,250 square yards of commercial ones.

In February this year, the 50-year-old farmer got his plots registered at the sub-registrar’s office in Thullur town of Guntur district. He booked an appointment through a government-run app and turned up with his Aadhaar number, a unique identity provided by the government of India to every citizen. Rao’s land documents, complete with a map, certificate, and carrying a unique QR code, were prepared by officials and sent directly to the registration office, all done in just a couple of hours.

Kommineni Ramanjaneyulu, another farmer from around Thullur, exchanged 4.5 acres for 10 plots. The 83-year-old was wary of this new technology deployed to streamline the land registration process. However, he was relieved to see the documents for his new assets in his native language, Telugu. There was no information gap….

In theory, blockchain can store land documents in a tamper-proof, secure network, reducing human interventions and adding more transparency. Data is solidified and the transaction history of a property is fully trackable. This has the potential to reduce, if not entirely prevent, property fraud. But unlike in the case of bitcoin, the blockchain utilised by the government agency in charge of shaping Amaravati is private.

So, despite the promise on paper, local landowners and farmers remain convinced that there’s no escaping red tape and corruption yet….

The entire documentation process for this massive exercise is based on blockchain. The decentralised distributed ledger system—central to cryptocurrencies like bitcoin and ether—can create foolproof digitised land registries of the residential and commercial plots allotted to farmers. It essentially serves as a book-keeping tool that can be accessed by all but is owned by none…

Having seen the government’s dirty tricks, most of the farmers gathered at Rayapudi aren’t buying the claim that the system is tamper-proof—especially at the stages before the information is moved to blockchain. After all, assignments and verifications are still being done by revenue officers on the ground.

That the Andhra Pradesh government is using a private blockchain complicates things further. The public can view information but not directly monitor whether any illicit changes have been made to their records. They have to go through the usual red tape to get those answers. The system may not be susceptible to hacking, but authorities could deliberately enter wrong information or refuse to reveal instances of fraud even if they are logged. This is the farmers’ biggest concern.

“The tampering cannot be stopped. If you give the right people a lot of bribe, they will go in and change the record,” said Seshagiri Rao. Nearly $700 million is paid in bribes across land registrars in India, an Andhra Pradesh government official estimated last year, and even probes into these matters are often flawed….(More)”.

How Mobile Network Operators Can Help Achieve the Sustainable Development Goals Profitably


Press Release: “Today, the Digital Impact Alliance (DIAL) released its second paper in a series focused on the promise of data for development (D4D). The paper, Leveraging Data for Development to Achieve Your Triple Bottom Line: Mobile Network Operators with Advanced Data for Good Capabilities See Stronger Impact to Profits, People and the Planet, will be presented at GSMA’s Mobile 360 Africa in Kigali.

“The mobile industry has already taken a driving seat in helping reach the Sustainable Development Goals by 2030 and this research reinforces the role mobile network operators in lower-income economies can play to leverage their network data for development and build a new data business safely and securely,” said Kate Wilson, CEO of the Digital Impact Alliance. “Mobile network operators (MNOs) hold unique data on customers’ locations and behaviors that can help development efforts. They have been reluctant to share data because there are inherent business risks and to do so has been expensive and time consuming.  DIAL’s research illustrates a path forward for MNOs on which data is useful to achieve the SDGs and why acting now is critical to building a long-term data business.”

DIAL worked with Altai Consulting on both primary and secondary research to inform this latest paper.  Primary research included one-on-one in-depth interviews with more than 50 executives across the data for development value chain, including government officials, civil society leaders, mobile network operators and other private sector representatives from both developed and emerging markets. These interviews help inform how operators can best tap into the shared value creation opportunities data for development provides.

Key findings from the in-depth interviews include:

  • There are several critical barriers that have prevented scaled use of mobile data for social good – including 1) unclear market opportunities, 2) not enough collaboration among MNOs, governments and non-profit stakeholders and 3) regulatory and privacy concerns;
  • While it may be an ideal time for MNOs to increase their involvement in D4D efforts given the unique data they have that can inform development, market shifts suggest the window of opportunity to implement large-scale D4D initiatives will likely not remain open for much longer;
  • Mobile Network Operators with advanced data for good capabilities will have the most success in establishing sustainable D4D efforts; and as a result, achieving triple bottom line mandates; and
  • Mobile Network Operators should focus on providing value-added insights and services rather than raw data and drive pricing and product innovation to meet the sector’s needs.

“Private sector data availability to drive public sector decision-making is a critical enabler for meeting SDG targets,” said Syed Raza, Senior Director of the Data for Development Team at the Digital Impact Alliance.  “Our data for development paper series aims to elevate the efforts of our industry colleagues with the information, insights and tools they need to help drive ethical innovation in this space….(More)”.

Let’s make private data into a public good


Article by Mariana Mazzucato: “The internet giants depend on our data. A new relationship between us and them could deliver real value to society….We should ask how the value of these companies has been created, how that value has been measured, and who benefits from it. If we go by national accounts, the contribution of internet platforms to national income (as measured, for example, by GDP) is represented by the advertisement-related services they sell. But does that make sense? It’s not clear that ads really contribute to the national product, let alone to social well-being—which should be the aim of economic activity. Measuring the value of a company like Google or Facebook by the number of ads it sells is consistent with standard neoclassical economics, which interprets any market-based transaction as signaling the production of some kind of output—in other words, no matter what the thing is, as long as a price is received, it must be valuable. But in the case of these internet companies, that’s misleading: if online giants contribute to social well-being, they do it through the services they provide to users, not through the accompanying advertisements.

This way we have of ascribing value to what the internet giants produce is completely confusing, and it’s generating a paradoxical result: their advertising activities are counted as a net contribution to national income, while the more valuable services they provide to users are not.

Let’s not forget that a large part of the technology and necessary data was created by all of us, and should thus belong to all of us. The underlying infrastructure that all these companies rely on was created collectively (via the tax dollars that built the internet), and it also feeds off network effects that are produced collectively. There is indeed no reason why the public’s data should not be owned by a public repository that sells the data to the tech giants, rather than vice versa. But the key issue here is not just sending a portion of the profits from data back to citizens but also allowing them to shape the digital economy in a way that satisfies public needs. Using big data and AI to improve the services provided by the welfare state—from health care to social housing—is just one example.

Only by thinking about digital platforms as collective creations can we construct a new model that offers something of real value, driven by public purpose. We’re never far from a media story that stirs up a debate about the need to regulate tech companies, which creates a sense that there’s a war between their interests and those of national governments. We need to move beyond this narrative. The digital economy must be subject to the needs of all sides; it’s a partnership of equals where regulators should have the confidence to be market shapers and value creators….(More)”.

Artificial Intelligence


Stanford Encyclopedia of Philosophy: “Artificial intelligence (AI) is the field devoted to building artificial animals (or at least artificial creatures that – in suitable contexts – appear to be animals) and, for many, artificial persons (or at least artificial creatures that – in suitable contexts – appear to be persons).[1] Such goals immediately ensure that AI is a discipline of considerable interest to many philosophers, and this has been confirmed (e.g.) by the energetic attempt, on the part of numerous philosophers, to show that these goals are in fact un/attainable. On the constructive side, many of the core formalisms and techniques used in AI come out of, and are indeed still much used and refined in, philosophy: first-order logic and its extensions; intensional logics suitable for the modeling of doxastic attitudes and deontic reasoning; inductive logic, probability theory, and probabilistic reasoning; practical reasoning and planning, and so on. In light of this, some philosophers conduct AI research and development as philosophy.

In the present entry, the history of AI is briefly recounted, proposed definitions of the field are discussed, and an overview of the field is provided. In addition, both philosophical AI (AI pursued as and out of philosophy) and philosophy of AI are discussed, via examples of both. The entry ends with some de rigueur speculative commentary regarding the future of AI….(More)”.