Tech is finally killing long lines


Erica Pandey at Axios: “Startups and big corporations alike are releasing technology to put long lines online.

Why it matters: Standing in lines has always been a hassle, but the pandemic has made lines longer, slower and even dangerous. Now many of those lines are going virtual.

What’s happening: Physical lines are disappearing at theme parks, doctor’s offices, clothing stores and elsewhere, replaced by systems that let you book a slot online and then wait to be notified that it’s your turn.

Whyline, an Argentine company recently acquired by the biometric ID company CLEAR, offers an app that lets users do just that — it will keep you up to date on your wait time and let you know when you need to show up.

  • Whyline’s list of clients — mostly in Latin America — includes banks, retail stores, the city of Lincoln, Nebraska, and Los Angeles International Airport.
  • “The same way you make a reservation at a restaurant, Whyline software does the waiting for you in banks, in DMVs, in airports,” CLEAR CEO Caryn Seidman-Becker said on CNBC.

Another app called Safe Queue was born from the pandemic and aims to make in-store shopping safer for customers and workers by spacing out shoppers’ visits.

  • The app uses GPS technology to detect when you’re within 1,000 feet of a participating store and automatically puts you in a virtual line. Then you can wait in your car or somewhere nearby until it’s your turn to shop.
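The geofence-triggered queue described above can be sketched in a few lines. This is a hypothetical illustration, not Safe Queue's actual implementation: the `StoreQueue` class, the haversine distance helper, and the 1,000-foot threshold constant are all assumptions drawn only from the behavior the article describes.

```python
from dataclasses import dataclass, field
from collections import deque
from math import radians, sin, cos, asin, sqrt

GEOFENCE_FEET = 1000  # trigger radius reported in the article

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 20_902_231  # mean Earth radius in feet

@dataclass
class StoreQueue:
    lat: float
    lon: float
    queue: deque = field(default_factory=deque)

    def on_location_update(self, shopper_id, lat, lon):
        """Enqueue a shopper the first time a GPS fix lands inside the geofence."""
        inside = distance_feet(self.lat, self.lon, lat, lon) <= GEOFENCE_FEET
        if inside and shopper_id not in self.queue:
            self.queue.append(shopper_id)
        return inside

    def admit_next(self):
        """Pop the shopper at the head of the line so they can be notified."""
        return self.queue.popleft() if self.queue else None
```

A shopper whose location update falls inside the radius is queued automatically; the store then admits them in arrival order, which is all the "wait in your car until it's your turn" flow requires.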

Many health clinics around the country are also putting their COVID test lines online.

The rub: While virtual queuing tech may be gaining ground, lines are still more common than not. And in the age of social distancing, expect wait times to remain high and lines to remain long…(More)”.

Empowering AI Leadership: AI C-Suite Toolkit


Toolkit by the World Economic Forum: “Artificial intelligence (AI) is one of the most important technologies for business, the economy and society and a driving force behind the Fourth Industrial Revolution. C-suite executives need to understand its possibilities and risks. This requires a multifaceted approach and holistic grasp of AI, spanning technical, organizational, regulatory, societal and also philosophical aspects. This toolkit provides a one-stop place for corporate executives to identify and understand the multiple and complex issues that AI raises for their business and society. It provides a practical set of tools to help them comprehend AI’s impact on their roles, ask the right questions, identify the key trade-offs and make informed decisions on AI strategy, projects and implementations…(More)”.

Interoperable, agile, and balanced


Brookings Paper on Rethinking technology policy and governance for the 21st century: “Emerging technologies are shifting market power and introducing a range of risks that can only be managed through regulation. Unfortunately, current approaches to governing technology are insufficient, fragmented, and lack focus on actionable goals. This paper proposes three tools that can be leveraged to support fit-for-purpose technology regulation for the 21st century: first, transparent and holistic policymaking levers that clearly communicate goals and identify trade-offs at the national and international levels; second, revamped efforts to collaborate across jurisdictions, particularly through standard-setting and evidence gathering of critical incidents across jurisdictions; and third, a shift towards agile governance, whether acquired through the system, design, or both…(More)”.

‘In Situ’ Data Rights


Essay by Marshall W Van Alstyne, Georgios Petropoulos, Geoffrey Parker, and Bertin Martens: “…Data portability sounds good in theory—number portability improved telephony—but this theory has its flaws.

  • Context: The value of data depends on context. Removing data from that context removes value. A portability exercise by experts at the ProgrammableWeb succeeded in downloading basic Facebook data but failed on a re-upload.1 Individual posts shed the prompts that preceded them and the replies that followed them. After all, that data concerns others.
  • Stagnation: Without a flow of updates, a captured stock depreciates. Data must be refreshed to stay current, and potential users must see those data updates to stay informed.
  • Impotence: Facts removed from their place of residence become less actionable. We cannot use them to make a purchase when removed from their markets or reach a friend when they are removed from their social networks. Data must be reconnected to be reanimated.
  • Market Failure: Innovation is slowed. Consider how markets for business analytics and B2B services develop. Lacking complete context, third parties can only offer incomplete benchmarking and analysis. Platforms that do offer market overview services can charge monopoly prices because they have context that partners and competitors do not.
  • Moral Hazard: Proposed laws seek to give merchants data portability rights, but these entail a problem that competition authorities have not anticipated. Regulators seek to help merchants “multihome,” that is, affiliate with more than one platform. Merchants can take their earned ratings from one platform to another and foster competition. But when a merchant gains control over its ratings data, magically, low reviews can disappear! Consumers fraudulently edited their personal records under early U.K. open banking rules. With data editing capability, either side can increase fraud — surely not the goal of data portability.

Evidence suggests that following GDPR, E.U. ad effectiveness fell, E.U. Web revenues fell, investment in E.U. startups fell, the stock and flow of apps available in the E.U. fell, while Google and Facebook, who already had user data, gained rather than lost market share as small firms faced new hurdles the incumbents managed to avoid. To date, the results are far from regulators’ intentions.

We propose a new in situ data right for individuals and firms, and a new theory of benefits. Rather than take data from the platform, or ex situ as portability implies, let us grant users the right to use their data in the location where it resides. Bring the algorithms to the data instead of bringing the data to the algorithms. Users determine when and under what conditions third parties access their in situ data in exchange for new kinds of benefits. Users can revoke access at any time and third parties must respect that. This patches and repairs the portability problems…(More).”
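The access model the essay proposes — data stays in place, third parties bring algorithms to it under a user-controlled, revocable grant — can be sketched minimally. This is an illustrative assumption of how such a right might be enforced in software; the `InSituDataStore` class and its method names are inventions for this sketch, not anything from the essay.

```python
from dataclasses import dataclass, field

@dataclass
class InSituDataStore:
    """Data never leaves the platform; third parties submit algorithms
    that run in place, gated by a user-granted, revocable permission."""
    records: dict                             # user_id -> data held in situ
    grants: set = field(default_factory=set)  # (user_id, third_party) pairs

    def grant(self, user_id, third_party):
        """User authorizes a third party to compute over their data."""
        self.grants.add((user_id, third_party))

    def revoke(self, user_id, third_party):
        """User withdraws access; future runs are refused immediately."""
        self.grants.discard((user_id, third_party))

    def run(self, user_id, third_party, algorithm):
        """Bring the algorithm to the data: execute it in place and
        return only the result, never the raw records."""
        if (user_id, third_party) not in self.grants:
            raise PermissionError("no in situ access grant")
        return algorithm(self.records[user_id])
```

Note the design choice this encodes: because only `run`'s return value leaves the store, the context, freshness, and connectedness problems the bullets above attribute to portability never arise — the data is used where it lives.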

Mapping data portability initiatives, opportunities and challenges


OECD Report: “Data portability has become an essential tool for enhancing access to and sharing of data across digital services and platforms. This report explores to what extent data portability can empower users (natural and legal persons) to play a more active role in the re-use of their data across digital services and platforms. It also examines how data portability can help increase interoperability and data flows and thus enhance competition and innovation by reducing switching costs and lock-in effects….(More)”.

The argument against property rights in data


Report by Open Future: “25 years after the adoption of the Database Directive, there is mounting evidence that the introduction of the sui generis right did not lead to increased data access and use–instead, an additional intellectual property layer became one more obstacle.

Today, the European Commission, as it drafts the new Data Act, faces a fundamental choice both regarding the existing sui generis database rights and the introduction of a similar right to raw, machine-generated data. There is a risk that an approach that treats data as property will be further strengthened through a new data producer’s right. The idea of such a new exclusive right was introduced by the European Commission in 2017. This proposed right was to be based on the same template as the sui generis database right. 

A new property right will not secure the goals defined in the European data strategy: those of ensuring access and use of data, in a data economy built around common data spaces. Instead, it will strengthen existing monopolies in the data economy. 

Instead of introducing new property rights, greater access to and use of data should be achieved by introducing–in the Data Act, and in other currently debated legal acts–access rights that treat data as a commons. 

In this policy brief, we present the current policy debate on access and use of data, as well as the history of proposals for property rights in data – including the sui generis database right. We present arguments against the introduction of new property rights, and in favor of strengthening data access rights….(More)”.

Regulating New Tech: Problems, Pathways, and People


Paper by Cary Coglianese: “New technologies bring with them many promises, but also a series of new problems. Even though these problems are new, they are not unlike the types of problems that regulators have long addressed in other contexts. The lessons from regulation in the past can thus guide regulatory efforts today. Regulators must focus on understanding the problems they seek to address and the causal pathways that lead to these problems. Then they must undertake efforts to shape the behavior of those in industry so that private sector managers focus on their technologies’ problems and take actions to interrupt the causal pathways. This means that regulatory organizations need to strengthen their own technological capacities; however, they need most of all to build their human capital. Successful regulation of technological innovation rests with top quality people who possess the background and skills needed to understand new technologies and their problems….(More)”.

The state of AI in 2021


McKinsey Global Survey on AI: “…indicate that AI adoption continues to grow and that the benefits remain significant — though in the COVID-19 pandemic’s first year, they were felt more strongly on the cost-savings front than the top line. As AI’s use in business becomes more common, the tools and best practices to make the most out of AI have also become more sophisticated. We looked at the practices of the companies seeing the biggest earnings boost from AI and found that they are not only following more of both the core and advanced practices, including machine-learning operations (MLOps), that underpin success but also spending more efficiently on AI and taking more advantage of cloud technologies. Additionally, they are more likely than other organizations to engage in a range of activities to mitigate their AI-related risks—an area that continues to be a shortcoming for many companies’ AI efforts…(More)”.

Business Data Sharing through Data Marketplaces: A Systematic Literature Review


Paper by Abbas, Antragama E., Wirawan Agahari, Montijn van de Ven, Anneke Zuiderwijk, and Mark de Reuver: “Data marketplaces are expected to play a crucial role in tomorrow’s data economy, but such marketplaces are seldom commercially viable. Currently, there is no clear understanding of the knowledge gaps in data marketplace research, especially not of neglected research topics that may advance such marketplaces toward commercialization. This study provides an overview of the state-of-the-art of data marketplace research. We employ a Systematic Literature Review (SLR) approach to examine 133 academic articles and structure our analysis using the Service-Technology-Organization-Finance (STOF) model. We find that the extant data marketplace literature is primarily dominated by technical research, such as discussions about computational pricing and architecture. To move past the first stage of the platform’s lifecycle (i.e., platform design) to the second stage (i.e., platform adoption), we call for empirical research in non-technological areas, such as customer expected value and market segmentation….(More)”.

Articulating Value from Data


Report by the World Economic Forum: “The distinct characteristics and dynamics of data – contextual, relational and cumulative – call for new approaches to articulating its value. Businesses should value data based on cases that go beyond the transactional monetization of data and take into account the broader context, future opportunities to collaborate and innovate, and value created for its ecosystem stakeholders. Doing so will encourage companies to think about the future value data can help generate, beyond the existing data lakes they sit on, and open them up to collaboration opportunities….(More)”.