Automation exacts a toll in inequality


Rana Foroohar at The Financial Times: “When humans compete with machines, wages go down and jobs go away. But, ultimately, new categories of better work are created. The mechanisation of agriculture in the first half of the 20th century, or advances in computing and communications technology in the 1950s and 1960s, for example, went hand in hand with strong, broadly shared economic growth in the US and other developed economies.

But, in later decades, something in this relationship began to break down. Since the 1980s, we’ve seen the robotics revolution in manufacturing; the rise of software in everything; the consumer internet and the internet of things; and the growth of artificial intelligence. But during this time, trend GDP growth in the US has slowed, inequality has risen, and many workers — particularly men without college degrees — have seen their real earnings fall sharply.

Globalisation and the decline of unions have played a part. But so has technological job disruption. That issue is beginning to get serious attention in Washington. In particular, politicians and policymakers are homing in on the work of MIT professor Daron Acemoglu, whose research shows that mass automation is no longer a win-win for both capital and labour. He testified at a select committee hearing of the US House of Representatives in November that automation — the substitution of machines and algorithms for tasks previously performed by workers — is responsible for 50-70 per cent of the economic disparities experienced between 1980 and 2016.

Why is this happening? Basically, while the automation of the early 20th century and the post-1945 period “increased worker productivity in a diverse set of industries and created myriad opportunities for them”, as Acemoglu said in his testimony, “what we’ve experienced since the mid 1980s is an acceleration in automation and a very sharp deceleration in the introduction of new tasks”. Put simply, he added, “the technological portfolio of the American economy has become much less balanced, and in a way that is highly detrimental to workers and especially low-education workers.”

What’s more, some things we are automating these days aren’t so economically beneficial. Consider those annoying computerised checkout stations in drug stores and groceries that force you to self-scan your purchases. They may save retailers a bit in labour costs, but they are hardly the productivity enhancer of, say, a self-driving combine harvester. Cecilia Rouse, chair of the White House’s Council of Economic Advisers, spoke for many when she told a Council on Foreign Relations event that she’d rather “stand in line [at the pharmacy] so that someone else has a job — it may not be a great job, but it is a job — and where I actually feel like I get better assistance.”

Still, there’s no holding back technology. The question is how to make sure more workers can capture its benefits. In her “Virtual Davos” speech a couple of weeks ago, Treasury secretary Janet Yellen pointed out that recent technologically driven productivity gains might exacerbate rather than mitigate inequality. She pointed to the fact that, while the “pandemic-induced surge in telework” will ultimately raise US productivity by 2.7 per cent, the gains will accrue mostly to upper-income, white-collar workers, just as online learning has been better accessed and leveraged by wealthier, white students.

Education is where the rubber meets the road in fixing technology-driven inequality. As Harvard researchers Claudia Goldin and Lawrence Katz have shown, when the relationship between education and technology gains breaks down, tech-driven prosperity is no longer as widely shared. This is why the Biden administration has been pushing investments into community college, apprenticeships and worker training…(More)”.

Public Provides NASA with New Innovations through Prize Competitions, Crowdsourcing, Citizen Science Opportunities


NASA Report: “Whether problem-solving during the pandemic, establishing a long-term presence at the Moon, or advancing technology to adapt to life in space, NASA has leveraged open innovation tools to inspire solutions to some of our most timely challenges – while using the creativity of everyone from garage tinkerers to citizen scientists and students of all ages.

Open Innovation: Boosting NASA Higher, Faster, and Farther highlights some of those breakthroughs, which accelerate space technology development and discovery while giving the public a gateway to work with NASA. Open innovation initiatives include problem-focused challenges and prize competitions, data hackathons, citizen science, and crowdsourcing projects that invite the public to lend their skills, ideas, and time to support NASA research and development programs.

NASA engaged the public with 56 public prize competitions and challenges and 14 citizen science and crowdsourcing activities over fiscal years 2019 and 2020. NASA awarded $2.2 million in prize money, and members of the public submitted over 11,000 solutions during that period.

“NASA’s accomplishments have hardly been NASA’s alone. Tens of thousands more individuals from academic institutions, private companies, and other space agencies also contribute to these solutions. Open innovation expands the NASA community and broadens the agency’s capacity for innovation and discovery even further,” said Amy Kaminski, Prizes, Challenges, and Crowdsourcing program executive at NASA Headquarters in Washington. “We harness the perspectives, expertise, and enthusiasm of ‘the crowd’ to gain diverse solutions, speed up projects, and reduce costs.”

This edition of the publication highlights:

  • How NASA used open innovation tools to accelerate the pace of problem-solving during the COVID-19 pandemic, enabling a sprint of creativity to create valuable solutions in support of this global crisis
  • How NASA invited everyone to embrace the Moon as a technological testing ground through public prize competitions and challenges, sparking development that could help prolong human stays on the Moon and lay the foundation for human exploration to Mars and beyond  
  • How citizen scientists gather, sort, and upload data, resulting in fruitful partnerships between the public and NASA scientists
  • How NASA’s student-focused challenges have changed lives and positively impacted underserved communities…(More)”.

Suicide hotline shares data with for-profit spinoff, raising ethical questions


Alexandra Levine at Politico: “Crisis Text Line is one of the world’s most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.

But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization’s for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.

Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly “anonymized,” stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris’ case, by making “customer support more human, empathetic, and scalable.”

In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive…(More).”

When Do Informational Interventions Work? Experimental Evidence from New York City High School Choice


Paper by Sarah Cohodes, Sean Corcoran, Jennifer Jennings & Carolyn Sattin-Bajaj: “This paper reports the results of a large, school-level randomized controlled trial evaluating a set of three informational interventions for young people choosing high schools in 473 middle schools, serving over 115,000 8th graders. The interventions differed in their level of customization to the student and their mode of delivery (paper or online); all treated schools received identical materials to scaffold the decision-making process. Every intervention reduced the likelihood of application to and enrollment in schools with graduation rates below the city median (75 percent). An important channel is their effect on reducing non-optimal first-choice application strategies. Providing a simplified, middle-school-specific list of relatively high graduation rate schools had the largest impacts, causing students to enroll in high schools with 1.5-percentage-point higher graduation rates. Providing the same information online, however, did not alter students’ choices or enrollment. This appears to be due to low utilization. Online interventions with individual customization, including a recommendation tool and search engine, induced students to enroll in high schools with 1-percentage-point higher graduation rates, but with more variance in impact. Together, these results show that successful informational interventions must generate engagement with the material, and this is possible through multiple channels…(More)”.
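
The abstract reports effects in percentage points from a school-level randomized design. As a rough illustration of how an intention-to-treat estimate with school-clustered standard errors can be computed, the sketch below uses simulated data; the sample sizes, effect size, noise level, and variable names are placeholders, not the paper's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated student-level data; treatment is assigned at the (middle) school level
rng = np.random.default_rng(0)
n_schools, students_per_school = 40, 50
school_id = np.repeat(np.arange(n_schools), students_per_school)
treated = np.repeat(rng.integers(0, 2, n_schools), students_per_school)
# Outcome: graduation rate of the high school each student ends up enrolling in
grad_rate = 75 + 1.5 * treated + rng.normal(0, 5, n_schools * students_per_school)

df = pd.DataFrame({"school": school_id, "treated": treated, "grad_rate": grad_rate})

# Intention-to-treat effect, with standard errors clustered by middle school
model = smf.ols("grad_rate ~ treated", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]}
)
print(f"ITT effect: {model.params['treated']:.2f} pp (SE {model.bse['treated']:.2f})")
```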

We Still Can’t See American Slavery for What It Was


Jamelle Bouie at the New York Times: “…It is thanks to decades of painstaking, difficult work that we know a great deal about the scale of human trafficking across the Atlantic Ocean and about the people aboard each ship. Much of that research is available to the public in the form of the SlaveVoyages database. A detailed repository of information on individual ships, individual voyages and even individual people, it is a groundbreaking tool for scholars of slavery, the slave trade and the Atlantic world. And it continues to grow. Last year, the team behind SlaveVoyages introduced a new data set with information on the domestic slave trade within the United States, titled “Oceans of Kinfolk.”

The systematic effort to quantify the slave trade goes back at least as far as the 19th century…

Because of its specificity with regard to individual enslaved people, this new information is as pathbreaking for lay researchers and genealogists as it is for scholars and historians. It is also, for me, an opportunity to think about the difficult ethical questions that surround this work: How exactly do we relate to data that allows someone — anyone — to identify a specific enslaved person? How do we wield these powerful tools for quantitative analysis without abstracting the human reality away from the story? And what does it mean to study something as wicked and monstrous as the slave trade using some of the tools of the trade itself?…

“The data that we have about those ships is also kind of caught in a stranglehold of ship captains who care about some things and don’t care about others,” Jennifer Morgan said. We know what was important to them. It is the task of the historian to bring other resources to bear on this knowledge, to shed light on what the documents, and the data, might obscure.

“By merely reproducing the metrics of slave traders,” Fuentes said, “you’re not actually providing us with information about the people, the humans, who actually bore the brunt of this violence. And that’s important. It is important to humanize this history, to understand that this happened to African human beings.”

It’s here that we must engage with the question of the public. Work like the SlaveVoyages database exists in the “digital humanities,” a frequently public-facing realm of scholarship and inquiry. And within that context, an important part of respecting the humanity of the enslaved is thinking about their descendants.

“If you’re doing a digital humanities project, it exists in the world,” said Jessica Marie Johnson, an assistant professor of history at Johns Hopkins and the author of “Wicked Flesh: Black Women, Intimacy, and Freedom in the Atlantic World.” “It exists among a public that is beyond the academy and beyond Silicon Valley. And that means that there should be certain other questions that we ask, a different kind of ethics of care and a different morality that we bring to things.”…(More)”.

The Work of the Future: Building Better Jobs in an Age of Intelligent Machines


Book by David Autor, David A. Mindell and Elisabeth B. Reynolds: “The United States has too many low-quality, low-wage jobs. Every country has its share, but those in the United States are especially poorly paid and often without benefits. Meanwhile, overall productivity increases steadily and new technology has transformed large parts of the economy, enhancing the skills and paychecks of higher-paid knowledge workers. What’s wrong with this picture? Why have so many workers benefited so little from decades of growth? The Work of the Future shows that technology is neither the problem nor the solution. We can build better jobs if we create institutions that leverage technological innovation and also support workers through long cycles of technological transformation.

Building on findings from the multiyear MIT Task Force on the Work of the Future, the book argues that we must foster institutional innovations that complement technological change. Skills programs that emphasize work-based and hybrid learning (in person and online), for example, empower workers to become and remain productive in a continuously evolving workplace. Industries fueled by new technology that augments workers can supply good jobs, and federal investment in R&D can help make these industries worker-friendly. We must act to ensure that the labor market of the future offers benefits, opportunity, and a measure of economic security to all….(More)”.

Automating the War on Noise Pollution


Article by Linda Poon: “Any city dweller is no stranger to the frequent revving of motorbikes and car engines, made all the more intolerable after the months of silence during pandemic lockdowns. Some cities have decided to take action. 

Paris police set up an anti-noise patrol in 2020 to ticket motorists whose vehicles exceed a certain decibel level, and soon, the city will start piloting the use of noise sensors in two neighborhoods. Called Medusa, each device uses four microphones to detect and measure noise levels, and two cameras to help authorities track down the culprit. No decibel threshold or fines will be set during the three-month trial period, according to French newspaper Liberation, but it’ll test the potential and limits of automating the war on sound pollution.
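
The article does not describe how Medusa processes audio, but the basic measurement behind any such sensor (converting microphone samples into a sound pressure level and checking it against a limit) can be sketched in a few lines. This is an illustrative sketch only: it assumes samples already calibrated to pascals, and the 85 dB threshold is an arbitrary example, not a value from the Paris pilot, which sets no threshold during the trial.

```python
import numpy as np

P_REF = 20e-6  # standard reference sound pressure (20 micropascals) for dB SPL

def sound_level_db_spl(pressure_samples: np.ndarray) -> float:
    """Sound pressure level in dB SPL from samples expressed in pascals.

    Assumes the microphone signal is already calibrated to pascals; real
    devices also apply per-microphone calibration and frequency weighting.
    """
    rms = np.sqrt(np.mean(np.square(pressure_samples)))
    return 20.0 * np.log10(max(rms, 1e-12) / P_REF)

def exceeds_threshold(pressure_samples: np.ndarray, threshold_db: float) -> bool:
    """True if the measured level is above a chosen (hypothetical) limit."""
    return sound_level_db_spl(pressure_samples) > threshold_db

# Synthetic one-second clip at 16 kHz standing in for a loud pass-by (~0.8 Pa RMS)
clip = np.random.randn(16_000) * 0.8
print(f"measured level: {sound_level_db_spl(clip):.1f} dB SPL")
if exceeds_threshold(clip, threshold_db=85.0):
    print("noise event flagged; a deployed system would pair this with camera evidence")
```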

Cities like Toronto and Philadelphia are also considering deploying similar tools. By now, research has been mounting about the health effects of continuous noise exposure, including links to high blood pressure and heart disease, and to poor mental health. And for years, many cities have been tackling noise through ordinances and urban design, including various bans on leaf blowers, on construction at certain hours and on cars. Some have even hired “night mayors” to, among other things, address complaints about after-hours noise.

But enforcement, even with the help of simple camera-and-noise radars, has been a challenge. Since 2018, the Canadian city of Edmonton has been piloting the use of four radars attached to light poles at busy intersections in the downtown area. A 2021 report on the second phase of the project, completed in 2020, found that officials had to manually sift through the data to take out noise made by, say, sirens. And the recordings didn’t always provide strong enough evidence against the offender in court. It was also costly: The pilot cost taxpayers $192,000, while fines generated a little more than half that amount, according to CTV News Edmonton.

Those obstacles have made noise pollution an increasingly popular target for smart city innovation, with companies and researchers looking to make environmental monitoring systems do more than just measure decibel levels…(More)”.

Counting Crimes: An Obsolete Paradigm


Paul Wormeli at The Criminologist: “To the extent that a paradigm is defined as the way we view things, the crime statistics paradigm in the United States is inadequate and requires reinvention….The statement—”not all crime is reported to the police”—lies at the very heart of why our current crime data are inherently incomplete. It is a direct reference to the fact that not all “street crime” is reported and that state and local law enforcement are not the only entities responsible for overseeing violations of societally established norms (“street crime” or otherwise). Two significant gaps exist, in that: 1) official reporting of crime from state and local law enforcement agencies cannot provide insight into unreported incidents, and 2) state and local law enforcement may not have or acknowledge jurisdiction over certain types of matters, such as cybercrime, corruption, environmental crime, or terrorism, and therefore cannot or do not report on those incidents…

All of these gaps in crime reporting mask the portrait of crime in the U.S. If there was a complete accounting of crime that could serve as the basis of policy formulation, including the distribution of federal funds to state and local agencies, there could be a substantial impact across the nation. Such a calculation would move the country toward a more rational basis for determining federal support for communities based on a comprehensive measure of community wellness.

In its deliberations, the NAS Panel recognized that it is essential to consider both the concepts of classification and the rules of counting as we seek a better and more practical path to describing crime in the U.S. and its consequences. The panel postulated that a meaningful classification of incidents found to be crimes would go beyond the traditional emphasis on street crime and include all crime categories.

The NAS study identified the missing elements of a national crime report as including more complete data on crimes involving drug-related offenses, criminal acts where juveniles are involved, so-called white-collar crimes such as fraud and corruption, cybercrime, crime against businesses, environmental crimes, and crimes against animals. Just as one example, it is highly unlikely that we will know the full extent of fraudulent claims against all federal, state, and local governments in the face of the massive influx of funding from recent and forthcoming Congressional action.

In proposing a set of crime classifications, the NAS panel recommended 11 major categories, 5 of which are not addressed in our current crime data collection systems. While there are parallel data systems that collect some of the missing data within these five crime categories, it remains unclear which federal agency, if any, has the authority to gather the information and aggregate it to give us anywhere near a complete estimate of crime in the United States. No federal or national entity has the assignment of estimating the total amount of crime that takes place in the United States. Without such leadership, we are left with an uninformed understanding of the health and wellness of communities throughout the country…(More)”

How digital transformation is driving economic change


Blog (and book) by Zia Qureshi: “We are living in a time of exciting technological innovations. Digital technologies are driving transformative change. Economic paradigms are shifting. The new technologies are reshaping product and factor markets and profoundly altering business and work. The latest advances in artificial intelligence and related innovations are expanding the frontiers of the digital revolution. Digital transformation is accelerating in the wake of the COVID-19 pandemic. The future is arriving faster than expected.

A recently published book, “Shifting Paradigms: Growth, Finance, Jobs, and Inequality in the Digital Economy,” examines the implications of the unfolding digital metamorphosis for economies and public policy agendas….

Firms at the technological frontier have broken away from the rest, acquiring dominance in increasingly concentrated markets and capturing the lion’s share of the returns from the new technologies. While productivity growth in these firms has been strong, it has stagnated or slowed in other firms, depressing aggregate productivity growth. Increasing automation of low- to middle-skill tasks has shifted labor demand toward higher-level skills, hurting wages and jobs at the lower end of the skill spectrum. With the new technologies favoring capital, winner-take-all business outcomes, and higher-level skills, the distribution of both capital and labor income has tended to become more unequal, and income has been shifting from labor to capital.

One important reason for these outcomes is that policies and institutions have been slow to adjust to the unfolding transformations. To realize the promise of today’s smart machines, policies need to be smarter too. They must be more responsive to change to fully capture potential gains in productivity and economic growth and address rising inequality as technological disruptions create winners and losers.

As technology reshapes markets and alters growth and distributional dynamics, policies must ensure that markets remain inclusive and support wide access to the new opportunities for firms and workers. The digital economy must be broadened to disseminate new technologies and opportunities to smaller firms and wider segments of the labor force…(More)”.

Tech is finally killing long lines


Erica Pandey at Axios: “Startups and big corporations alike are releasing technology to put long lines online.

Why it matters: Standing in lines has always been a hassle, but the pandemic has made lines longer, slower and even dangerous. Now many of those lines are going virtual.

What’s happening: Physical lines are disappearing at theme parks, doctor’s offices, clothing stores and elsewhere, replaced by systems that let you book a slot online and then wait to be notified that it’s your turn.

Whyline, an Argentinian company that was just acquired by the biometric ID company CLEAR, is an app that lets users do just that — it will keep you up to date on your wait time and let you know when you need to show up.

  • Whyline’s list of clients — mostly in Latin America — includes banks, retail stores, the city of Lincoln, Nebraska, and Los Angeles International Airport.
  • “The same way you make a reservation at a restaurant, Whyline software does the waiting for you in banks, in DMVs, in airports,” CLEAR CEO Caryn Seidman-Becker said on CNBC.

Another app called Safe Queue was born from the pandemic and aims to make in-store shopping safer for customers and workers by spacing out shoppers’ visits.

  • The app uses GPS technology to detect when you’re within 1,000 feet of a participating store and automatically puts you in a virtual line. Then you can wait in your car or somewhere nearby until it’s your turn to shop.
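
Neither app's implementation is public, but the pattern described above (use GPS to detect when a shopper is inside a roughly 1,000-foot geofence, then hold their place in a first-in, first-out line) can be sketched as follows. The store coordinates, distance check, and queue class here are illustrative assumptions, not Safe Queue's or Whyline's actual code.

```python
import math
from collections import deque

FEET_PER_METER = 3.28084
GEOFENCE_FEET = 1_000  # radius mentioned in the article

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two GPS points, in feet."""
    earth_radius_m = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a)) * FEET_PER_METER

class VirtualQueue:
    """Toy virtual line: shoppers join when they enter the store's geofence."""

    def __init__(self, store_lat: float, store_lon: float):
        self.store = (store_lat, store_lon)
        self.line = deque()

    def check_in(self, shopper_id: str, lat: float, lon: float) -> bool:
        """Add the shopper to the line if they are within the geofence."""
        if shopper_id not in self.line and distance_feet(lat, lon, *self.store) <= GEOFENCE_FEET:
            self.line.append(shopper_id)
            return True
        return False

    def admit_next(self):
        """Pop the shopper at the head of the line (i.e., notify them it's their turn)."""
        return self.line.popleft() if self.line else None

# Hypothetical store and a shopper parked nearby
queue = VirtualQueue(40.7128, -74.0060)
queue.check_in("shopper-42", 40.7130, -74.0059)  # roughly 80 feet away, so they join
print(queue.admit_next())  # -> shopper-42
```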

Many health clinics around the country are also putting their COVID test lines online…

The rub: While virtual queuing tech may be gaining ground, lines are still more common than not. And in the age of social distancing, expect wait times to remain high and lines to remain long…(More)”.