Don’t Fear the Robots, and Other Lessons From a Study of the Digital Economy


Steve Lohr at the New York Times: “L. Rafael Reif, the president of the Massachusetts Institute of Technology, delivered an intellectual call to arms to the university’s faculty in November 2017: Help generate insights into how advancing technology has changed and will change the work force, and what policies would create opportunity for more Americans in the digital economy.

That issue, he wrote, is the “defining challenge of our time.”

Three years later, the task force assembled to address it is publishing its wide-ranging conclusions. The 92-page report, “The Work of the Future: Building Better Jobs in an Age of Intelligent Machines,” was released on Tuesday….

Here are four of the key findings in the report:

Most American workers have fared poorly.

It’s well known that those on the top rungs of the job ladder have prospered for decades while wages for average American workers have stagnated. But the M.I.T. analysis goes further. It found, for example, that real wages for men without four-year college degrees have declined 10 to 20 percent since their peak in 1980….

Robots and A.I. are not about to deliver a jobless future.

…The M.I.T. researchers concluded that the change would be more evolutionary than revolutionary. In fact, they wrote, “we anticipate that in the next two decades, industrialized countries will have more job openings than workers to fill them.”…

Worker training in America needs to match the market.

“The key ingredient for success is public-private partnerships,” said Annette Parker, president of South Central College, a community college in Minnesota, and a member of the advisory board to the M.I.T. project.

The schools, nonprofits and corporate-sponsored programs that have succeeded in lifting people into middle-class jobs all echo her point: the need to link skills training to business demand….

Workers need more power, voice and representation.

The report calls for raising the minimum wage, broadening unemployment insurance, and modifying labor laws to enable collective bargaining in occupations such as domestic work, home care, and freelance work. Such representation, the report notes, could come from traditional unions or worker advocacy groups like the National Domestic Workers Alliance, Jobs With Justice and the Freelancers Union….(More)”

For the Win


Revised and Updated Book by Kevin Werbach and Dan Hunter on “The Power of Gamification and Game Thinking in Business, Education, Government, and Social Impact”: “For thousands of years, we’ve created things called games that tap the tremendous psychic power of fun. In a revised and updated edition of For the Win: The Power of Gamification and Game Thinking in Business, Education, Government, and Social Impact, authors Kevin Werbach and Dan Hunter argue that applying the lessons of gamification could change your business, the way you learn or teach, and even your life.

Werbach and Hunter explain how games can be used as a valuable tool to address serious pursuits like marketing, productivity enhancement, education, innovation, customer engagement, human resources, and sustainability. They reveal how, why, and when gamification works—and what not to do.

Discover the successes—and failures—of organizations that are using gamification:

  • How a South Korean company called Neofect is using gamification to help people recover from strokes;
  • How a tool called SuperBetter has demonstrated significant results treating depression, concussion symptoms, and the mental health harms of the COVID-19 pandemic through game thinking;
  • How the ride-hailing giant Uber once used gamification to push its drivers to work longer hours than they wanted, prompting swift backlash.

The story of gamification isn’t fun and games by any means. It’s serious. When used carefully and thoughtfully, gamification produces great outcomes for users, in ways that are hard to replicate through other methods. Other times, companies misuse the “guided missile” of gamification to push people to work and act in ways that run against their own self-interest.

This revised and updated edition incorporates the most prominent research findings to provide a comprehensive gamification playbook for the real world….(More)”.

Remaking the Commons: How Digital Tools Facilitate and Subvert the Common Good


Paper by Jessica Feldman: “This scoping paper considers how digital tools, such as ICTs and AI, have failed to contribute to the “common good” in any sustained or scalable way. This is attributed to a problem that is at once political-economic and technical.

Many digital tools’ business models are predicated on advertising: framing the user as an individual consumer-to-be-targeted, not as an organization, movement, or any sort of commons. At the level of infrastructure and hardware, the increased privatization and centralization of transmission and production leads to a dangerous bottlenecking of communication power, and to labor and production practices that are undemocratic and damaging to common resources.

These practices escalate collective action problems, pose a threat to democratic decision making, aggravate issues of economic and labor inequality, and harm the environment and health. At the same time, the growth of both AI and online community formation raises questions around the very definition of human subjectivity and modes of relationality. Based on an operational definition of the common good grounded in ethics of care, sustainability, and redistributive justice, suggestions are made for solutions and further research in the areas of participatory design, digital democracy, digital labor, and environmental sustainability….(More)”

Leveraging Open Data with a National Open Computing Strategy


Policy Brief by Lara Mangravite and John Wilbanks: “Open data mandates and investments in public data resources, such as the Human Genome Project or the U.S. National Oceanic and Atmospheric Administration Data Discovery Portal, have provided essential data sets at a scale not possible without government support. By responsibly sharing data for wide reuse, federal policy can spur innovation inside the academy and in citizen science communities. These approaches are enabled by private-sector advances in cloud computing services and the government has benefited from innovation in this domain. However, the use of commercial products to manage the storage of and access to public data resources poses several challenges.

First, too many cloud computing systems fail to properly secure data against breaches, improperly share copies of data with other vendors, or use data to add to their own secretive and proprietary models. As a result, the public does not trust technology companies to responsibly manage public data—particularly private data of individual citizens. These fears are exacerbated by the market power of the major cloud computing providers, which may limit the ability of individuals or institutions to negotiate appropriate terms. This impacts the willingness of U.S. citizens to have their personal information included within these databases.

Second, open data solutions are springing up across multiple sectors without coordination. The federal government is funding a series of independent programs that are working to solve the same problem, leading to a costly duplication of effort across programs.

Third and most importantly, the high costs of data storage, transfer, and analysis preclude many academics, scientists, and researchers from taking advantage of governmental open data resources. Cloud computing has radically lowered the costs of high-performance computing, but it is still not free. The cost of building the wrong model at the wrong time can quickly run into tens of thousands of dollars.

Scarce resources mean that many academic data scientists are unable or unwilling to spend their limited funds to reuse data in exploratory analyses outside their narrow projects. And citizen scientists must use personal funds, which are especially scarce in communities traditionally underrepresented in research. The vast majority of public data made available through existing open science policy is therefore left unused, either as reference material or as “foreground” for new hypotheses and discoveries….The Solution: Public Cloud Computing…(More)”.

Evaluating Identity Disclosure Risk in Fully Synthetic Health Data: Model Development and Validation


Paper by Khaled El Emam et al: “There has been growing interest in data synthesis for enabling the sharing of data for secondary analysis; however, there is a need for a comprehensive privacy risk model for fully synthetic data: If the generative models have been overfit, then it is possible to identify individuals from synthetic data and learn something new about them.

Objective: The purpose of this study is to develop and apply a methodology for evaluating the identity disclosure risks of fully synthetic data.

Methods: A full risk model is presented, which evaluates both identity disclosure and the ability of an adversary to learn something new if there is a match between a synthetic record and a real person. We term this “meaningful identity disclosure risk.” The model is applied on samples from the Washington State Hospital discharge database (2007) and the Canadian COVID-19 cases database. Both of these datasets were synthesized using a sequential decision tree process commonly used to synthesize health and social science data.

Results: The meaningful identity disclosure risk for both of these synthesized samples was below the commonly used 0.09 risk threshold (0.0198 and 0.0086, respectively), and 4 times and 5 times lower than the risk values for the original datasets, respectively.

Conclusions: We have presented a comprehensive identity disclosure risk model for fully synthetic data. The results for this synthesis method on 2 datasets demonstrate that synthesis can reduce meaningful identity disclosure risks considerably. The risk model can be applied in the future to evaluate the privacy of fully synthetic data….(More)”.
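The paper’s full risk model is more elaborate than can be shown here, but its core ingredient, checking whether synthetic records match real people on quasi-identifiers, can be sketched in a few lines. This is a minimal illustration only: the field names, toy records, and the reading of the 0.09 threshold below are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of a match-based disclosure risk estimate for
# synthetic data: the fraction of synthetic records whose
# quasi-identifiers exactly match at least one real record.
# (The published model also weighs what an adversary would *learn*
# from a match; this sketch covers only the matching step.)

def match_rate(real_records, synthetic_records, quasi_identifiers):
    """Return the fraction of synthetic records that match a real
    record on every listed quasi-identifier."""
    real_keys = {
        tuple(r[q] for q in quasi_identifiers) for r in real_records
    }
    matches = sum(
        1 for s in synthetic_records
        if tuple(s[q] for q in quasi_identifiers) in real_keys
    )
    return matches / len(synthetic_records)

# Toy data, for illustration only.
real = [
    {"age": 34, "zip": "98101", "sex": "F"},
    {"age": 52, "zip": "98052", "sex": "M"},
]
synthetic = [
    {"age": 34, "zip": "98101", "sex": "F"},  # matches a real person
    {"age": 29, "zip": "98199", "sex": "M"},  # no match
]

risk = match_rate(real, synthetic, ["age", "zip", "sex"])
print(risk)  # 0.5 — far above a 0.09-style threshold for this toy sample
```

In this toy example half the synthetic records collide with real people, so the sample would fail a 0.09-style threshold; the values the paper reports (0.0198 and 0.0086) indicate how rarely such meaningful matches occurred in the synthesized datasets.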

Federal Regulators Increase Focus on Patient Risks From Electronic Health Records


Ben Moscovitch at Pew: “…The Office of the National Coordinator for Health Information Technology (ONC) will collect clinicians’ feedback through a survey developed by the Urban Institute under a contract with the agency. ONC will release aggregated results as part of its EHR reporting program. Congress required the program’s creation in the 21st Century Cures Act, the wide-ranging federal health legislation enacted in 2016. The act directs ONC to determine which data to gather from health information technology vendors. That information can then be used to illuminate the strengths and weaknesses of EHR products, as well as industry trends.

The Pew Charitable Trusts, major medical organizations and hospital groups, and health information technology experts have urged that the reporting program examine usability-related patient risks. Confusing, cumbersome, and poorly customized EHR systems can cause health care providers to order the wrong drug or miss test results and other information critical to safe, effective treatment. Usability challenges also can increase providers’ frustration and, in turn, their likelihood of making mistakes.

The data collected from clinicians will shed light on these problems, encourage developers to improve the safety of their products, and help hospitals and doctors’ offices make better-informed decisions about the purchase, implementation, and use of these tools. Research shows that aggregated data about EHRs can generate product-specific insights about safety deficiencies, even when health care facilities implement the same system in distinct ways….(More)”.

How the U.S. Military Buys Location Data from Ordinary Apps


Joseph Cox at Vice: “The U.S. military is buying the granular movement data of people around the world, harvested from innocuous-seeming apps, Motherboard has learned. The most popular app among those Motherboard analyzed that are connected to this sort of data sale is a Muslim prayer and Quran app that has more than 98 million downloads worldwide. Others include a Muslim dating app, a popular Craigslist app, an app for following storms, and a “level” app that can be used to help, for example, install shelves in a bedroom.

Through public records, interviews with developers, and technical analysis, Motherboard uncovered two separate, parallel data streams that the U.S. military uses, or has used, to obtain location data. One relies on a company called Babel Street, which creates a product called Locate X. U.S. Special Operations Command (USSOCOM), a branch of the military tasked with counterterrorism, counterinsurgency, and special reconnaissance, bought access to Locate X to assist on overseas special forces operations. The other stream is through a company called X-Mode, which obtains location data directly from apps, then sells that data to contractors, and by extension, the military.

The news highlights the opaque location data industry and the fact that the U.S. military, which has infamously used other location data to target drone strikes, is purchasing access to sensitive data. Many of the users of apps involved in the data supply chain are Muslim, which is notable considering that the United States has waged a decades-long war on predominantly Muslim terror groups in the Middle East, and has killed hundreds of thousands of civilians during its military operations in Pakistan, Afghanistan, and Iraq. Motherboard does not know of any specific operations in which this type of app-based location data has been used by the U.S. military.

The apps sending data to X-Mode include Muslim Pro, an app that reminds users when to pray and what direction Mecca is in relation to the user’s current location. The app has been downloaded over 50 million times on Android, according to the Google Play Store, and over 98 million in total across other platforms including iOS, according to Muslim Pro’s website….(More)”.

Building Trust for Inter-Organizational Data Sharing: The Case of the MLDE


Paper by Heather McKay, Sara Haviland, and Suzanne Michael: “There is increasing interest in sharing, across agencies and even between states, data that was once siloed in separate agencies. Driving this is a need to better understand how people experience education and work, and their pathways through each. A data-sharing approach offers many possible advantages, allowing states to leverage pre-existing data systems to conduct increasingly sophisticated and complete analyses. However, information sharing across state organizations presents a series of complex challenges, one of which is the central role trust plays in building successful data-sharing systems. Trust building between organizations is therefore crucial to ensuring project success.

This brief examines the process of building trust within the context of the development and implementation of the Multistate Longitudinal Data Exchange (MLDE). The brief is based on research and evaluation activities conducted by Rutgers’ Education & Employment Research Center (EERC) over the past five years, which included 40 interviews with state leaders and the Western Interstate Commission for Higher Education (WICHE) staff, observations of user group meetings, surveys, and MLDE document analysis. It is one in a series of MLDE briefs developed by EERC….(More)”.

unBail


About: “The criminal legal system is a maze of laws, language, and unwritten rules that lawyers are trained to navigate in order to represent defendants.

However, according to the Bureau of Justice Statistics, only 27% of county public defender offices meet national caseload recommendations for cases per attorney, meaning that most public defenders are overworked, leaving their clients underrepresented.

Defendants must complete an estimated 200 discrete tasks during their legal proceeding. This leaves them overwhelmed, lost, and profoundly disadvantaged while attempting to navigate the system….

We have… created a product that acts as the trusted advisor for defendants and their families as they navigate the criminal legal system. We aim to deliver valuable and relevant legal information (but not legal advice) to the user in plain language, empowering them to advocate for themselves and proactively plan for the future and access social services if necessary. The user is also encouraged to give feedback on their experience at each step of the process in the hope that this can be used to improve the system….(More)”

The Work of the Future: Shaping Technology and Institutions


Report by David Autor, David Mindell and Elisabeth Reynolds for the MIT Future of Work Task Force: “The world now stands on the cusp of a technological revolution in artificial intelligence and robotics that may prove as transformative for economic growth and human potential as were electrification, mass production, and electronic telecommunications in their eras. New and emerging technologies will raise aggregate economic output and boost the wealth of nations. Will these developments enable people to attain higher living standards, better working conditions, greater economic security, and improved health and longevity? The answers to these questions are not predetermined. They depend upon the institutions, investments, and policies that we deploy to harness the opportunities and confront the challenges posed by this new era.

How can we move beyond unhelpful prognostications about the supposed end of work and toward insights that will enable policymakers, businesses, and people to better navigate the disruptions that are coming and underway? What lessons should we take from previous epochs of rapid technological change? How is it different this time? And how can we strengthen institutions, make investments, and forge policies to ensure that the labor market of the 21st century enables workers to contribute and succeed?

To help answer these questions, and to provide a framework for the Task Force’s efforts over the next year, this report examines several aspects of the interaction between work and technology….(More)”.