Normalizing Health-Positive Technology


Article by Sara J. Singer, Stephen Downs, Grace Ann Joseph, Neha Chaudhary, Christopher Gardner, Nina Hersher, Kelsey P. Mellard, Norma Padrón & Yennie Solheim: “….Aligning the technology sector with a societal goal of greater health and well-being entails a number of shifts in thinking. The most fundamental is understanding health not as a vertical market segment, but as a horizontal value: In addition to developing a line of health products or services, health should be expressed across a company’s full portfolio of products and services. Rather than pushing behaviors on people through information and feedback, technology companies should also pull behaviors from people by changing the environment and products they are offered; in addition to developing technology to help people overcome the challenge of being healthy, we need to envision technology that helps to reduce the challenges to being healthy. And in addition to holding individuals responsible for choices that they make, we also need to recognize the collective responsibility that society bears for the choices it makes available.

How to catalyze these shifts?

To find out, we convened a gathering on “tech-enabled health,” in which 50 entrepreneurs, leaders from large technology companies, investors, policymakers, clinicians, and public health experts designed a hands-on, interactive, and substantively focused agenda. Participants brainstormed ways that consumer-facing technologies could help people move more, eat better, sleep well, stay socially connected, and reduce stress. In groups and collectively, participants also considered ways in which ideas related and might be synergistic, potential barriers and contextual conditions that might impede or support transformation, and strategies for catalyzing the desired shift. Participants were mixed in terms of sector, discipline, and gender (though the attendees were not as diverse in terms of race/ethnicity or economic strata as the users we potentially wanted to impact—a limitation noted by participants). We intentionally maintained a positive tone, emphasizing potential benefits of shifting toward a health-positive approach, rather than bemoaning the negative role that technology can play….(More)”.

Defining a ‘new normal’ for data privacy in the wake of COVID-19


Jack Dunn at IAPP: “…It is revealing that our relationship with privacy is amorphous and requires additional context in light of transformative technologies, new economic realities and public health emergencies. How can we reasonably evaluate the costs and benefits of Google or Facebook sharing location data with the federal government when it has been perfectly legal for Walgreens to share access to customer data with pharmaceutical advertisers? How does aggregating and anonymizing data safeguard privacy when a user’s personal data can be revealed through other data points?

The pandemic is only revealing that we’ve yet to reach a consensus on privacy norms that will come to define the digital age. 

This isn’t the first time that technology has confounded notions of privacy and consumer protection. In fact, the constitutional right to privacy was born out of another public health crisis. Before 1965, 32 women per 100,000 live births died while giving birth. Similarly, 25 infants died per 100,000 live births. As a result, medical professionals and women’s rights advocates began arguing for greater access to birth control. When state legislatures sought to minimize access, birth control advocates filed lawsuits that eventually led to the Supreme Court’s seminal case regarding the right to privacy, Griswold v. Connecticut.

Today, there is growing public concern over the way in which consumer data is used to consolidate economic gain among the few while steering public perception among the many — particularly at a time when privacy seems to be the price for ending public health emergencies.

But the COVID-19 outbreak is also highlighting how user data has the capacity to improve consumer well-being and public health. While strict adherence to traditional notions of privacy may be ineffectual in a time of exponential technological growth, the history of our relationship to privacy and technology suggests regulatory policies can strike a balance between otherwise competing interests….(More)”.

Tech Firms Are Spying on You. In a Pandemic, Governments Say That’s OK.


Sam Schechner, Kirsten Grind and Patience Haggin at the Wall Street Journal: “While an undergraduate at the University of Virginia, Joshua Anton created an app to prevent users from drunk dialing, which he called Drunk Mode. He later began harvesting huge amounts of user data from smartphones to resell to advertisers.

Now Mr. Anton’s company, called X-Mode Social Inc., is one of a number of little-known location-tracking companies that are being deployed in the effort to reopen the country. State and local authorities wielding the power to decide when and how to reopen are leaning on these vendors for the data to underpin those critical judgment calls.

In California, Gov. Gavin Newsom’s office used data from Foursquare Labs Inc. to figure out if beaches were getting too crowded; when the state discovered they were, it tightened its rules. In Denver, the Tri-County Health Department is monitoring counties where the population on average tends to stray more than 330 feet from home, using data from Cuebiq Inc.

Researchers at the University of Texas in San Antonio are using movement data from a variety of companies, including the geolocation firm SafeGraph, to guide city officials there on the best strategies for getting residents back to work.

Many of the location-tracking firms, data brokers and other middlemen are part of the ad-tech industry, which has come under increasing fire in recent years for building what critics call a surveillance economy. Data for targeting ads at individuals, including location information, can also end up in the hands of law-enforcement agencies or political groups, often with limited disclosure to users. Privacy laws are cropping up in states including California, along with calls for federal privacy legislation like that in the European Union.

But some public-health authorities are setting aside those concerns to fight an unprecedented pandemic. Officials are desperate for all types of data to identify people potentially infected with the virus and to understand how they are behaving to predict potential hot spots—whether those people realize it or not…(More)”
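As a hypothetical illustration of the kind of mobility metric referenced in the Denver example above, the sketch below computes a device’s average distance from a “home” point using raw location pings. The article does not describe Cuebiq’s or the Tri-County Health Department’s actual methodology; the haversine calculation, the 330-foot threshold check, and all names and coordinates here are assumptions for illustration only.

```python
# Illustrative sketch (not the vendors' actual method) of an "average distance
# from home" mobility metric computed from location pings.
import math


def haversine_feet(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in feet."""
    r_feet = 20_902_231  # approximate Earth radius in feet
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_feet * math.asin(math.sqrt(a))


def avg_distance_from_home(home, pings):
    """Mean distance (feet) of a device's pings from its inferred home point."""
    if not pings:
        return 0.0
    return sum(haversine_feet(home[0], home[1], lat, lon) for lat, lon in pings) / len(pings)


if __name__ == "__main__":
    home = (39.7392, -104.9903)  # hypothetical home coordinates near Denver
    pings = [(39.7392, -104.9903), (39.7401, -104.9890), (39.7420, -104.9950)]
    avg = avg_distance_from_home(home, pings)
    print(f"average distance from home: {avg:.0f} feet; exceeds 330 ft: {avg > 330}")
```

A real pipeline of this sort would presumably also need to infer each device’s home location (for example, from overnight dwell points) and aggregate device-level averages up to the county level before reporting anything to officials.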

How to Sustain Your Activism Against Police Brutality Beyond this Moment


Article by Bethany Gordon: “…Despite the haunting nature of these details and the different features of this moment, I am worried that empathetic voices lifting up this cause will quiet too soon for lasting change to occur. But it doesn’t have to happen this way. Gaining a better understanding of the empathy we feel in these moments of awareness and advocacy can help us take a more behaviorally sustainable approach.

Empathy is a complex psychological phenomenon, describing eight distinct ways that we respond to one another’s experiences and emotions, but most commonly defined in the dictionary as “the ability to understand and share the feelings of another.” Using this broader definition, scholars and activists have debated how effective empathy is as a tool for behavior change—particularly when it comes to fighting racism. Paul Bloom argues that empathy allows our bias to drive our decision-making, bell hooks states that empathy is not a promising avenue to systemic racial change, and Alisha Gaines analyzes how an overemphasis on racial empathy in a 1944 landmark study, “An American Dilemma,” led to a blindness about the impact of systemic and institutional racial barriers. This more general understanding and application of empathy has not been an effective aid to fighting systemic oppression and has led to a lot of (well-meaning?) blackface.

A more nuanced understanding of empathy—and its related concepts—may help us use it more effectively in the fight against racism. There are two strains of empathy that are relevant to the George Floyd protests and can help us better understand (and possibly change) our response: empathic distress and empathic concern, also known as compassion.

Empathic distress is a type of empathy we feel when we are disturbed by witnessing another’s suffering. Empathic distress is an egocentric response—a reaction that places our own well-being at its center. When we’re motivated to act through empathic distress, our ultimate goal is to alleviate our own suffering. This may mean we take action to help another person, but it could also mean we distract ourselves from their suffering.

Compassion is a type of empathy that is other-oriented. Compassion operates when you feel for another person rather than being distressed by their suffering, thereby making your ultimate goal about fixing the actual problem….(More)”.

IoT Security Is a Mess. Privacy ‘Nutrition’ Labels Could Help


Lily Hay Newman at Wired: “…Given that IoT security seems unlikely to magically improve anytime soon, researchers and regulators are rallying behind a new approach to managing IoT risk. Think of it as nutrition labels for embedded devices.

At the IEEE Symposium on Security & Privacy last month, researchers from Carnegie Mellon University presented a prototype security and privacy label they created based on interviews and surveys of people who own IoT devices, as well as privacy and security experts. They also published a tool for generating their labels. The idea is to shed light on a device’s security posture but also explain how it manages user data and what privacy controls it has. For example, the labels highlight whether a device can get security updates and how long a company has pledged to support it, as well as the types of sensors present, the data they collect, and whether the company shares that data with third parties.

“In an IoT setting, the amount of sensors and information you have about users is potentially invasive and ubiquitous,” says Yuvraj Agarwal, a networking and embedded systems researcher who worked on the project. “It’s like trying to fix a leaky bucket. So transparency is the most important part. This work shows and enumerates all the choices and factors for consumers.”

Nutrition labels on packaged foods have a certain amount of standardization around the world, but they’re still more opaque than they could be. And security and privacy issues are even less intuitive to most people than soluble and insoluble fiber. So the CMU researchers focused a lot of their efforts on making their IoT label as transparent and accessible as possible. To that end, they included both a primary and secondary layer to the label. The primary label is what would be printed on device boxes. To access the secondary label, you could follow a URL or scan a QR code to see more granular information about a device….(More)”.
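To make the two-layer label design concrete, here is a minimal sketch of how such a label might be represented as a machine-readable data structure, with a summary layer for the device box and a detail layer reached via URL or QR code. The field names, example values, and placeholder URL are assumptions for illustration; they are not the CMU researchers’ published specification.

```python
# Hypothetical sketch of a two-layer IoT "privacy nutrition" label as a data
# structure. Field names are illustrative, not the CMU label specification.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PrimaryLabel:
    """Summary layer intended to be printed on the device box."""
    device_name: str
    security_updates_until: str        # how long the vendor pledges support
    sensors: List[str]                 # e.g., microphone, camera, GPS
    data_shared_with_third_parties: bool


@dataclass
class SecondaryLabel:
    """Granular layer reached by following a URL or scanning a QR code."""
    primary: PrimaryLabel
    data_collected: List[str] = field(default_factory=list)
    data_retention: str = "unspecified"
    privacy_controls: List[str] = field(default_factory=list)
    label_url: str = "https://example.com/device-label"  # placeholder URL


if __name__ == "__main__":
    primary = PrimaryLabel(
        device_name="Example Smart Speaker",
        security_updates_until="2026-01-01",
        sensors=["microphone"],
        data_shared_with_third_parties=True,
    )
    secondary = SecondaryLabel(
        primary=primary,
        data_collected=["voice recordings", "usage logs"],
        data_retention="18 months",
        privacy_controls=["mute switch", "data deletion request"],
    )
    print(secondary)
```

A machine-readable form along these lines is what would let retailers, search engines, or regulators compare devices programmatically, rather than relying on the printed summary label alone.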

How Data-Driven Cities Respond Swiftly and Effectively to COVID-19


Blog Post by Jennifer Park, Lauren Su, Lisa Fiedler, and Madeleine Weatherhead: “Since January of this year, the novel coronavirus has swept rapidly throughout the United States, leaving no city untouched. To contain the virus’ spread and protect residents’ health and livelihoods, local leaders have had to act swiftly and decisively. It is a challenge in scope and scale unlike any other in recent history — and it has underscored the power of data to guide life-and-death decisions and build trust.

Take, for example, Los Angeles. As cities across the country began issuing states of emergency and acting to promote public health, Mayor Eric Garcetti quickly identified the city’s response priorities: supporting families, small businesses, healthcare workers, and unhoused Angelenos, and increasing the healthcare equipment and testing kits available for the city. Mayor Garcetti tapped his Chief Information Officer and Innovation Team to collect and analyze data, inform decisions, and share real-time information publicly.

[Image caption: A snapshot of Los Angeles’ publicly shared data from one of the city’s daily COVID-19 summary briefings. Courtesy of the City of Los Angeles’ Innovation Team.]

The Mayor was soon conducting daily briefings, updating the public on the latest virus-related data and informing city residents about various decisions made by the city — from pausing parking rules enforcement to opening thousands of temporary shelter beds. He used data to justify key decisions, linking stay-at-home orders to a decrease in COVID-19 cases from week to week.

Los Angeles’ swift response built on an existing culture of leveraging data to set goals, make decisions, and communicate with the public. Its leaders are now seeing the positive impact of having invested in foundational data capacity: regular tracking of cases, hospital capacity, and infection rates has proven vital to guiding and accelerating the city’s response to COVID-19.

Other cities, too, have leaned on established data practices and infrastructure in their response efforts, both to the benefit of their residents and to lay a stronger foundation to guide recovery….(More)”.

How Data Can Map and Make Racial Inequality More Visible (If Done Responsibly)


Reflection Document by The GovLab: “Racism is a systemic issue that pervades every aspect of life in the United States and around the world. In recent months, its corrosive influence has been made starkly visible, especially on Black people. Many people are hurting. Their rage and suffering stem from centuries of exclusion and from being subject to repeated bias and violence. Across the country, there have been protests decrying racial injustice. Activists have called upon the government to condemn bigotry and racism, to act against injustice, to address systemic and growing inequality.

Institutions need to take meaningful action to address such demands. Though racism is not experienced in the same way by all communities of color, policymakers must respond to the anxieties and apprehensions of Black people as well as those of communities of color more generally. This work will require institutions and individuals to reflect on how they may be complicit in perpetuating structural and systemic inequalities and harm and to ask better questions about the inequities that exist in society (laid bare in both recent acts of violence and in racial disadvantages in health outcomes during the ongoing COVID-19 crisis). This work is necessary but unlikely to be easy. As Rashida Richardson, Director of Policy Research at the AI Now Institute at NYU, notes:

“Social and political stratifications also persist and worsen because they are embedded into our social and legal systems and structures. Thus, it is difficult for most people to see and understand how bias and inequalities have been automated or operationalized over time.”

We believe progress can be made, at least in part, through responsible data access and analysis, including increased availability of (disaggregated) data through data collaboration. Of course, data is only one part of the overall picture, and we make no claims that data alone can solve such deeply entrenched problems. Nonetheless, data can have an impact by making inequalities resulting from racism more quantifiable and inaction less excusable.

…Prioritizing any of these topics will also require increased community engagement and participatory agenda setting. Likewise, we are deeply conscious that data can have a negative as well as positive impact and that technology can perpetuate racism when designed and implemented without the input and participation of minority communities and organizations. While our report here focuses on the promise of data, we need to remain aware of the potential to weaponize data against vulnerable and already disenfranchised communities. In addition, (hidden) biases in data collected and used in AI algorithms, as well as in a host of other areas across the data life cycle, will only exacerbate racial inequalities if not addressed….(More)”

ALSO: The piece is supplemented by a crowdsourced listing of Data-Driven Efforts to Address Racial Inequality.

Fear of a Black and Brown Internet: Policing Online Activism


Paper by Sahar F. Aziz and Khaled A. Beydoun: “Virtual surveillance is the modern extension of established policing models that tie dissident Muslim advocacy to terror suspicion and Black activism to political subversion. Countering Violent Extremism (“CVE”) and Black Identity Extremism (“BIE”) programs that specifically target Muslim and Black populations are shifting from on the ground to online.

Law enforcement exploits social media platforms — where activism and advocacy are robust — to monitor and crack down on activists. In short, the new policing is the old policing, but it is stealthily morphing and moving onto virtual platforms where activism is fluidly unfolding in real time. This Article examines how the law’s failure to keep up with technological advancements in social media poses serious risks to the ability of minority communities to mobilize against racial and religious injustice….(More)”.

Terms of Disservice: How Silicon Valley is Destructive by Design


Book by Dipayan Ghosh on “Designing a new digital social contract for our technological future…High technology presents a paradox. In just a few decades, it has transformed the world, making almost limitless quantities of information instantly available to billions of people and reshaping businesses, institutions, and even entire economies. But it also has come to rule our lives, addicting many of us to the march of megapixels across electronic screens both large and small.

Despite its undeniable value, technology is exacerbating deep social and political divisions in many societies. Elections influenced by fake news and unscrupulous hidden actors, the cyber-hacking of trusted national institutions, the vacuuming of private information by Silicon Valley behemoths, ongoing threats to vital infrastructure from terrorist groups and even foreign governments—all these concerns are now part of the daily news cycle and are certain to become increasingly serious into the future.

In this new world of endless technology, how can individuals, institutions, and governments harness its positive contributions while protecting each of us, no matter who or where we are?

In this book, a former Facebook public policy adviser who went on to assist President Obama in the White House offers practical ideas for using technology to create an open and accessible world that protects all consumers and civilians. As a computer scientist turned policymaker, Dipayan Ghosh answers the biggest questions about technology facing the world today. Providing clear and understandable explanations for complex issues, Terms of Disservice will guide industry leaders, policymakers, and the general public as we think about how we ensure that the Internet works for everyone, not just Silicon Valley….(More)”.

Toward Inclusive Urban Technology


Report by Denise Linn Riedl: “Our cities are changing at an incredible pace. The technology being deployed on our sidewalks and streetlights has the potential to improve mobility, sustainability, connectivity, and city services.

Public value and public inclusion in this change, however, are not inevitable. Depending on how these technologies are deployed, they have the potential to increase inequities and distrust as much as they can create responsive government services.

Recognizing this tension, an initial coalition of local practitioners began collaborating in 2019 with the support of the Benton Institute for Broadband & Society. We combined knowledge of and personal experience with local governments to tackle a common question: What does procedural justice look like when cities deploy new technology?

This guide is meant for any local worker—inside or outside of government—who is helping to plan or implement technological change in their community. It’s a collection of experiences, cases, and best practices that we hope will be valuable and will make projects stronger, more sustainable, and more inclusive….(More)”.