Investigation of Competition in Digital Markets


Press Release: “The House Judiciary Committee’s Antitrust Subcommittee today released the findings of its more than 16-month long investigation into the state of competition in the digital economy, especially the challenges presented by the dominance of Apple, Amazon, Google, and Facebook and their business practices.

The report, entitled Investigation of Competition in the Digital Marketplace: Majority Staff Report and Recommendations, totals more than 400 pages, marking the culmination of an investigation that included seven congressional hearings, the production of nearly 1.3 million internal documents and communications, submissions from 38 antitrust experts, and interviews with more than 240 market participants, former employees of the investigated platforms, and other individuals.

“As they exist today, Apple, Amazon, Google, and Facebook each possess significant market power over large swaths of our economy. In recent years, each company has expanded and exploited their power of the marketplace in anticompetitive ways,” said Judiciary Committee Chairman Jerrold Nadler (NY-10) and Antitrust Subcommittee Chairman David N. Cicilline (RI-01) in a joint statement. “Our investigation leaves no doubt that there is a clear and compelling need for Congress and the antitrust enforcement agencies to take action that restores competition, improves innovation, and safeguards our democracy. This Report outlines a roadmap for achieving that goal.”

After outlining the challenges presented by the market domination of Amazon, Apple, Google, and Facebook, the report walks through a series of possible remedies to (1) restore competition in the digital economy, (2) strengthen the antitrust laws, and (3) reinvigorate antitrust enforcement.

The slate of recommendations includes:

  • Structural separations to prohibit platforms from operating in lines of business that depend on or interoperate with the platform;
  • Prohibiting platforms from engaging in self-preferencing;
  • Requiring platforms to make their services compatible with competing networks to allow for interoperability and data portability;
  • Mandating that platforms provide due process before taking action against market participants;
  • Establishing a standard to proscribe strategic acquisitions that reduce competition;
  • Improvements to the Clayton Act, the Sherman Act, and the Federal Trade Commission Act, to bring these laws into line with the challenges of the digital economy;
  • Eliminating anticompetitive forced arbitration clauses;
  • Strengthening the Federal Trade Commission (FTC) and the Antitrust Division of the Department of Justice;
  • And promoting greater transparency and democratization of the antitrust agencies….(More)”.

Building the World We Deserve: A New Framework for Infrastructure


Introductory letter to a new whitepaper published by Siegel Family Endowment that outlines a new framework for understanding and funding infrastructure: “This story begins, as many set in New York City do, with the subway. As transportation enthusiasts, we’re fascinated by trains, especially the remarkable system that runs above and below the city’s streets. It was the discovery of this shared passion for understanding how our subway system works that got us talking about infrastructure a few years ago.

Infrastructure, in the most traditional sense, brings to mind physical constructions: city streets, power lines, the pipes that carry water into your home. But what about all the other things that make society function? Having seen the decline in investment in the country’s physical infrastructure, and aware of the many ways the digital world is upending our definition of the term, we began exploring how Siegel Family Endowment could play a role in the future of infrastructure.

Over the past two years of research and conversations with partners across the field, we’ve realized that our nation’s infrastructure is due for a reset. Hearing the term should evoke a different image: an interconnected web of assets, seen and unseen, that make up the foundation upon which the complicated machinery of modern society operates. It’s inherently multidimensional.

In 2020, the United States has reckoned with a health pandemic and a watershed moment in the fight for racial equity. These challenges highlight how relevant it is to reconsider what society deems the most critical, foundational assets for its citizens—and to ensure they have access to those assets.

Funding infrastructure is often considered the responsibility of government agencies. Yet many of our peers in philanthropy have made important investments in the field. These include working with local governments to fund research, promote novel forms of public-private partnership, and, ultimately, better serve citizens. And if infrastructure is viewed through the broader lens we argue for in this paper, it becomes clear just how much philanthropy, the nonprofit sector, and private entities are investing in our digital and social ecosystems.

We believe that we can do more—and better—if we commit as a country to adopting some of the principles outlined in this paper. However, we also consider this the beginning of a conversation. The time for us to think bigger and bolder about infrastructure is here. Our challenge now is to design it so that more people may thrive….(More)”.

Lessons learned from AI ethics principles for future actions


Paper by Merve Hickok: “As the use of artificial intelligence (AI) systems has become significantly more prevalent in recent years, concerns about how these systems collect, use, and process big data have also increased. To address these concerns and advocate for ethical and responsible development and implementation of AI, non-governmental organizations (NGOs), research centers, private companies, and governmental agencies have published more than 100 AI ethics principles and guidelines. This first wave was followed by a series of suggested frameworks, tools, and checklists that attempt a technical fix to the issues raised in the high-level principles. Principles are important for creating a common understanding of priorities and are the groundwork for future governance and opportunities for innovation. However, a review of these documents based on their country of origin and funding entities shows that private companies from the US-West axis dominate the conversation. Several cases have since surfaced that demonstrate biased algorithms and their impact on individuals and society. The field of AI ethics is urgently calling for tangible action to move from high-level abstractions and conceptual arguments towards applying ethics in practice and creating accountability mechanisms. However, lessons must be learned from the shortcomings of AI ethics principles to ensure that future investments, collaborations, standards, codes, or legislation reflect the diversity of voices and incorporate the experiences of those already impacted by biased algorithms….(More)”.

Social license for the use of big data in the COVID-19 era


Commentary by James A. Shaw, Nayha Sethi & Christine K. Cassel: “… Social license refers to the informal permissions granted to institutions such as governments or corporations by members of the public to carry out a particular set of activities. Much of the literature on the topic of social license has arisen in the field of natural resources management, emphasizing issues that include but go beyond environmental stewardship [4]. In their seminal work on social license in the pulp and paper industry, Gunningham et al. defined social license as the “demands and expectations” placed on organizations by members of civil society which “may be tougher than those imposed by regulation”; these expectations thereby demand actions that go beyond existing legal rules to demonstrate concern for the interests of publics. We use the plural term “publics” as opposed to the singular “public” to illustrate that stakeholder groups to which organizations must appeal are often diverse and varied in their assessments of whether a given organizational activity is acceptable [6]. Despite the potentially fragmented views of various publics, the concept of social license is considered in a holistic way (either an organization has it or does not). Social license is closely related to public trust, and where publics view a particular institution as trustworthy it is more likely to have social license to engage in activities such as the collection and use of personal data [7].

The question of how the leaders of an organization might better understand whether they have social license for a particular set of activities has also been addressed in the literature. In a review of literature on social license, Moffat et al. highlighted disagreement in the research community about whether social license can be accurately measured [4]. Certain groups of researchers emphasize that because of the intangible nature of social license, accurate measurement will never truly be possible. Others propose conceptual models of the determinants of social license, and establish surveys that assess those determinants to indicate the presence or absence of social license in a given context. However, accurate measurement of social license remains a point of debate….(More)”.

Blockchain Chicken Farm: And Other Stories of Tech in China’s Countryside


Book by Xiaowei R. Wang: “In Blockchain Chicken Farm, the technologist and writer Xiaowei Wang explores the political and social entanglements of technology in rural China. Their discoveries force them to challenge the standard idea that rural culture and people are backward, conservative, and intolerant. Instead, they find that rural China has not only adapted to rapid globalization but has actually innovated the technology we all use today. From pork farmers using AI to produce the perfect pig, to disruptive luxury counterfeits and the political intersections of e-commerce villages, Wang unravels the ties between globalization, technology, agriculture, and commerce in unprecedented fashion. Accompanied by humorous “Sinofuturist” recipes that frame meals as they transform under new technology, Blockchain Chicken Farm is an original and probing look into innovation, connectivity, and collaboration in the digitized rural world.

FSG Originals × Logic dissects the way technology functions in everyday lives. The titans of Silicon Valley, for all their utopian imaginings, never really had our best interests at heart: recent threats to democracy, truth, privacy, and safety, as a result of tech’s reckless pursuit of progress, have shown as much. We present an alternate story, one that delights in capturing technology in all its contradictions and innovation, across borders and socioeconomic divisions, from history through the future, beyond platitudes and PR hype, and past doom and gloom. Our collaboration features four brief but provocative forays into the tech industry’s many worlds, and aspires to incite fresh conversations about technology focused on nuanced and accessible explorations of the emerging tools that reorganize and redefine life today….(More)”.

Situating Open Data: Global Trends in Local Contexts


Open Access Book edited by Danny Lämmerhirt, Ana Brandusescu, Natalia Domagala & Patrick Enaholo: “Open data and its effects on society are always woven into infrastructural legacies, social relations, and the political economy. This raises questions about how our understanding and engagement with open data shifts when we focus on its situated use. 

To shed light on these questions, Situating Open Data provides several empirical accounts of open data practices, the local implementation of global initiatives, and the development of new open data ecosystems. Drawing on case studies in different countries and contexts, the chapters demonstrate the practices and actors involved in open government data initiatives unfolding within different socio-political settings. 

The book proposes three recommendations for researchers, policy-makers and practitioners. First, beyond upskilling through ‘data literacy’ programmes, open data initiatives should be specified through the kinds of data practices and effects they generate. Second, global visions of open data implementation require more studies of the resonances and tensions created in localised initiatives. And third, research into open data ecosystems requires more attention to the histories and legacies of information infrastructures and how these shape who benefits from open data flows. 

As such, this volume departs from the framing of data as a resource to be deployed. Instead, it proposes a prism of different data practices in different contexts through which to study the social relations, capacities, infrastructural histories and power structures affecting open data initiatives. It is hoped that the contributions collected in Situating Open Data will spark critical reflection about the way open data is locally practiced and implemented. The contributions should be of interest to open data researchers, advocates, and those in or advising government administrations designing and rolling out effective open data initiatives….(More)”.

The Wisdom of the Crowd: Promoting Media Development through Deliberative Initiatives


Report by Craig Matasick: “…innovative new set of citizen engagement practices—collectively known as deliberative democracy—offers important lessons that, when applied to media development efforts, can help improve media assistance and strengthen independent media environments around the world. At a time when disinformation runs rampant, it is more important than ever to strengthen public demand for credible information, reduce political polarization, and prevent media capture. Deliberative democracy approaches can help tackle these issues by expanding the number and diversity of voices that participate in policymaking, thereby fostering greater collective action and enhancing public support for media reform efforts.

Through a series of five illustrative case studies, the report demonstrates how deliberative democracy practices can be employed in both media development and democracy assistance efforts, particularly in the Global South. Such initiatives produce recommendations that take into account a plurality of voices while building trust between citizens and decision-makers by demonstrating to participants that their issues will be heard and addressed. Ultimately, this process can enable media development funders and practitioners to identify priorities and design locally relevant projects that have a higher likelihood for long-term impact.

– Deliberative democracy approaches, which are characterized by representative participation and moderated deliberation, provide a framework to generate demand-driven media development interventions while at the same time building greater public support for media reform efforts.

– Deliberative democracy initiatives foster collaboration across different segments of society, building trust in democratic institutions, combatting polarization, and avoiding elite capture.

– When employed by news organizations, deliberative approaches provide a better understanding of the issues their audiences care most about and uncover new problems affecting citizens that might not otherwise have come to light….(More)”.

Metrics at Work: Journalism and the Contested Meaning of Algorithms


Book by Angèle Christin: “When the news moved online, journalists suddenly learned what their audiences actually liked, through algorithmic technologies that scrutinize web traffic and activity. Has this advent of audience metrics changed journalists’ work practices and professional identities? In Metrics at Work, Angèle Christin documents the ways that journalists grapple with audience data in the form of clicks, and analyzes how new forms of clickbait journalism travel across national borders.

Drawing on four years of fieldwork in web newsrooms in the United States and France, including more than one hundred interviews with journalists, Christin reveals many similarities among the media groups examined—their editorial goals, technological tools, and even office furniture. Yet she uncovers crucial and paradoxical differences in how American and French journalists understand audience analytics and how these affect the news produced in each country. American journalists routinely disregard traffic numbers and primarily rely on the opinion of their peers to define journalistic quality. Meanwhile, French journalists fixate on internet traffic and view these numbers as a sign of their resonance in the public sphere. Christin offers cultural and historical explanations for these disparities, arguing that distinct journalistic traditions structure how journalists make sense of digital measurements in the two countries.

Contrary to the popular belief that analytics and algorithms are globally homogenizing forces, Metrics at Work shows that computational technologies can have surprisingly divergent ramifications for work and organizations worldwide….(More)”.

An Open-Source Tool to Accelerate Scientific Knowledge Discovery


Mozilla: “Timely and open access to novel outputs is key to scientific research. It allows scientists to reproduce, test, and build on one another’s work — and ultimately unlock progress.

The most recent example of this is the research into COVID-19. Much of the work was published in open access journals and swiftly reviewed, ultimately improving our understanding of how to slow the spread and treat the disease. Although this rapid increase in scientific publications is evident in other domains too, we might not be reaping the benefits. The tools to parse and combine this newly created knowledge have remained roughly the same for years.

Today, Mozilla Fellow Kostas Stathoulopoulos is launching Orion — an open-source tool to illuminate the science behind the science and accelerate knowledge discovery in the life sciences. Orion enables users to monitor progress in science, visually explore the scientific landscape, and search for relevant publications.

Orion collects, enriches and analyses scientific publications in the life sciences from Microsoft Academic Graph.

Users can leverage Orion’s views to interact with the data. The Exploration view shows all of the academic publications in a three-dimensional visualization. Every particle is a paper, and the distance between particles signifies their semantic similarity; the closer two particles are, the more semantically similar the papers. The Metrics view visualizes indicators of scientific progress and how they have changed over time for countries and thematic topics. The Search view enables users to search for publications by submitting either a keyword or a longer query, for example, a sentence or a paragraph of a blog they read online….(More)”.
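The post does not detail Orion’s embedding or layout pipeline, but a rough sense of what “semantic similarity” between publications means in practice can be sketched in a few lines of Python. The toy abstracts, the TF-IDF vectorizer, and everything else below are illustrative assumptions for the example, not Orion’s actual implementation.

```python
# Minimal sketch: measure how "close" short texts are to one another using
# TF-IDF vectors and cosine similarity. A tool like Orion could map such
# similarities onto distances between particles in a 3-D view; the abstracts
# here are invented for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "CRISPR-based gene editing in hematopoietic stem cells.",
    "Genome editing of blood stem cells with CRISPR-Cas9.",
    "Deep learning methods for protein structure prediction.",
]

# Turn each abstract into a TF-IDF vector.
vectors = TfidfVectorizer(stop_words="english").fit_transform(abstracts)

# Pairwise cosine similarity: higher values mean more similar texts. The two
# CRISPR abstracts share several terms, while the protein-structure abstract
# shares none and scores 0 against them.
print(cosine_similarity(vectors).round(2))
```

A production system would likely use learned document embeddings rather than TF-IDF, but the principle of turning text into vectors and comparing distances is the same.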

Why Modeling the Spread of COVID-19 Is So Damn Hard


Matthew Hutson at IEEE Spectrum: “…Researchers say they’ve learned a lot of lessons modeling this pandemic, lessons that will carry over to the next.

The first set of lessons is all about data. Garbage in, garbage out, they say. Jarad Niemi, an associate professor of statistics at Iowa State University who helps run the forecast hub used by the CDC, says it’s not clear what we should be predicting. Infections, deaths, and hospitalization numbers each have problems, which affect their usefulness not only as inputs for the model but also as outputs. It’s hard to know the true number of infections when not everyone is tested. Deaths are easier to count, but they lag weeks behind infections. Hospitalization numbers have immense practical importance for planning, but not all hospitals release those figures. How useful is it to predict those numbers if you never have the true numbers for comparison? What we need, he said, is systematized random testing of the population, to provide clear statistics of both the number of people currently infected and the number of people who have antibodies against the virus, indicating recovery. Prakash, of Georgia Tech, says governments should collect and release data quickly in centralized locations. He also advocates for central repositories of policy decisions, so modelers can quickly see which areas are implementing which distancing measures.

Researchers also talked about the need for a diversity of models. At the most basic level, averaging an ensemble of forecasts improves reliability. More important, each type of model has its own uses—and pitfalls. An SEIR model is a relatively simple tool for making long-term forecasts, but the devil is in the details of its parameters: How do you set those to match real-world conditions now and into the future? Get them wrong and the model can head off into fantasyland. Data-driven models can make accurate short-term forecasts, and machine learning may be good for predicting complicated factors. But will the inscrutable computations of, for instance, a neural network remain reliable when conditions change? Agent-based models look ideal for simulating possible interventions to guide policy, but they’re a lot of work to build and tricky to calibrate.
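To make the SEIR discussion concrete, here is a minimal sketch of such a compartmental model; the transmission, incubation, and recovery parameters below are illustrative assumptions rather than values calibrated to COVID-19, which is exactly the parameter-setting difficulty the researchers describe.

```python
# Minimal SEIR sketch (Susceptible, Exposed, Infectious, Recovered), expressed
# as population fractions. beta, sigma, and gamma are assumed values chosen
# for illustration only; calibrating them to real-world conditions is the
# hard part the article highlights.
import numpy as np
from scipy.integrate import odeint

def seir(y, t, beta, sigma, gamma):
    s, e, i, r = y
    new_infections = beta * s * i   # S -> E
    onset = sigma * e               # E -> I
    recovery = gamma * i            # I -> R
    return [-new_infections, new_infections - onset, onset - recovery, recovery]

beta, sigma, gamma = 0.5, 1 / 5.0, 1 / 10.0   # assumed rates per day
y0 = [0.999, 0.001, 0.0, 0.0]                 # initial fractions of the population
t = np.linspace(0, 180, 181)                  # simulate 180 days

s, e, i, r = odeint(seir, y0, t, args=(beta, sigma, gamma)).T
print(f"Peak infectious fraction: {i.max():.3f} on day {int(t[i.argmax()])}")
```

Even this toy model shows how sensitive long-term forecasts are: halving the assumed beta (roughly what strict distancing aims to do) changes the size and timing of the peak dramatically, so small errors in the parameters compound over months.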

Finally, researchers emphasize the need for agility. Niemi of Iowa State says software packages have made it easier to build models quickly, and the code-sharing site GitHub lets people share and compare their models. COVID-19 is giving modelers a chance to try out all their newest tools, says Meyers, of the University of Texas. “The pace of innovation, the pace of development, is unlike ever before,” she says. “There are new statistical methods, new kinds of data, new model structures.”…(More)”.