10 learnings from considering AI Ethics through global perspectives


Blog by Sampriti Saxena and Stefaan G. Verhulst: “Artificial Intelligence (AI) technologies have the potential to solve the world’s biggest challenges. However, they also come with certain risks to individuals and groups. As these technologies become more prevalent around the world, we need to consider the ethical ramifications of AI use to identify and rectify potential harms. Equally, we need to consider the various associated issues from a global perspective, not assuming that a single approach will satisfy different cultural and societal expectations.

In February 2021, The Governance Lab (The GovLab), the NYU Tandon School of Engineering, the Global AI Ethics Consortium (GAIEC), the Center for Responsible AI @ NYU (R/AI), and the Technical University of Munich’s (TUM) Institute for Ethics in Artificial Intelligence (IEAI) launched AI Ethics: Global Perspectives. …A year and a half later, the course has grown to 38 modules, contributed by 40 faculty members representing over 20 countries. Our conversations with faculty members and our experiences with the course modules have yielded a wealth of knowledge about AI ethics. In keeping with the values of openness and transparency that underlie the course, we summarized these insights into ten learnings to share with a broader audience. In what follows, we outline our key lessons from experts around the world.

Our Ten Learnings:

  1. Broaden the Conversation
  2. The Public as a Stakeholder
  3. Centering Diversity and Inclusion in Ethics
  4. Building Effective Systems of Accountability
  5. Establishing Trust
  6. Ask the Right Questions
  7. The Role of Independent Research
  8. Humans at the Center
  9. Our Shared Responsibility
  10. The Challenge and Potential for a Global Framework…(More)”.

From Knowing to Doing: Operationalizing the 100 Questions for Air Quality Initiative


Report by Jessica Seddon, Stefaan G. Verhulst and Aimee Maron: “…summarizes the September 2021 capstone event that wrapped up 100 Questions for Air Quality, led by The GovLab and the World Resources Institute (WRI). This initiative brought together a group of 100 atmospheric scientists, policy experts, academics and data providers from around the world to identify the most important questions for setting a new, high-impact agenda for further investments in data and data science. After a thorough process of sourcing, clustering and ranking questions, the public was asked to vote. The results were surprising: the most important question was not about what new data or research is needed, but about how to do more with what we already know to generate political will and investments in air quality solutions.

Co-hosted by Clean Air Fund, Climate and Clean Air Coalition, and Clean Air Catalyst, the 2021 roundtable discussion focused on an answer to that question. This conference proceeding summary reflects early findings from that session and offers a starting point for a much-needed conversation on data-to-action. The group of experts and practitioners from academia, businesses, foundations, government, multilateral organizations, nonprofits, and think tanks have not been identified so they could speak freely….(More)”.

The Behavioral Economics Guide 2022


Editorial by Kathleen Vohs & Avni Shah: “This year’s Behavioral Economics Guide editorial reviews recent work in the areas of self-control and goals. To do so, we distilled the latest findings and advanced a set of guiding principles termed the FRESH framework: Fatigue, Reminders, Ease, Social influence, and Habits. Example findings reviewed include physicians giving out more prescriptions for opioids later in the workday compared to earlier (fatigue); the use of digital reminders to prompt people to re-engage with goals, such as for personal savings, from which they may have turned away (reminders); visual displays that give people data on their behavioral patterns so as to enable feedback and active monitoring (ease); the importance of geographically local peers in changing behaviors such as residential water use (social influence); and digital and other tools that help people break the link between aspects of the environment and problematic behaviors (habits). We used the FRESH framework as a potential guide for thinking about the kinds of behaviors people can perform in achieving the goal of being environmental stewards of a more sustainable future…(More)”.

Technology is Not Neutral: A Short Guide to Technology Ethics


Book by Stephanie Hare: “It seems that just about every new technology that we bring to bear on improving our lives brings with it some downside, side effect or unintended consequence.

These issues can pose very real and growing ethical problems for all of us. For example, automated facial recognition can make life easier and safer for us – but it also poses huge issues with regard to privacy, ownership of data and even identity theft. How do we understand and frame these debates, and work out strategies at personal and governmental levels?

Technology Is Not Neutral: A Short Guide to Technology Ethics addresses one of today’s most pressing problems: how to create and use tools and technologies to maximize benefits and minimize harms. Drawing on the author’s experience as a technologist, political risk analyst and historian, the book offers a practical and cross-disciplinary approach that will inspire anyone creating, investing in or regulating technology, and it will empower all readers to better hold technology to account…(More)”.

New laws to strengthen Canadians’ privacy protection and trust in the digital economy


Press Release: “Canadians increasingly rely on digital technology to connect with loved ones, to work and to innovate. That’s why the Government of Canada is committed to making sure Canadians can benefit from the latest technologies, knowing that their personal information is safe and secure and that companies are acting responsibly.

Today, the Honourable François-Philippe Champagne, Minister of Innovation, Science and Industry, together with the Honourable David Lametti, Minister of Justice and Attorney General of Canada, introduced the Digital Charter Implementation Act, 2022, which will significantly strengthen Canada’s private sector privacy law, create new rules for the responsible development and use of artificial intelligence (AI), and continue advancing the implementation of Canada’s Digital Charter. As such, the Digital Charter Implementation Act, 2022 will include three proposed acts: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.

The proposed Consumer Privacy Protection Act will address the needs of Canadians who rely on digital technology and respond to feedback received on previous proposed legislation. This law will ensure that the privacy of Canadians will be protected and that innovative businesses can benefit from clear rules as technology continues to evolve. This includes:

  • increasing control and transparency when Canadians’ personal information is handled by organizations;
  • giving Canadians the freedom to move their information from one organization to another in a secure manner;
  • ensuring that Canadians can request that their information be disposed of when it is no longer needed;
  • establishing stronger protections for minors, including by limiting organizations’ right to collect or use information on minors and holding organizations to a higher standard when handling minors’ information;
  • providing the Privacy Commissioner of Canada with broad order-making powers, including the ability to order a company to stop collecting data or using personal information; and
  • establishing significant fines for non-compliant organizations—with fines of up to 5% of global revenue or $25 million, whichever is greater, for the most serious offences.

The proposed Personal Information and Data Protection Tribunal Act will enable the creation of a new tribunal to facilitate the enforcement of the Consumer Privacy Protection Act. 

The proposed Artificial Intelligence and Data Act will introduce new rules to strengthen Canadians’ trust in the development and deployment of AI systems, including:

  • protecting Canadians by ensuring high-impact AI systems are developed and deployed in a way that identifies, assesses and mitigates the risks of harm and bias;
  • establishing an AI and Data Commissioner to support the Minister of Innovation, Science and Industry in fulfilling ministerial responsibilities under the Act, including by monitoring company compliance, ordering third-party audits, and sharing information with other regulators and enforcers as appropriate; and
  • outlining clear criminal prohibitions and penalties regarding the use of data obtained unlawfully for AI development, cases where the reckless deployment of AI poses serious harm, and cases where there is fraudulent intent to cause substantial economic loss through its deployment…(More)”.

How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned


Article by Nicole Nguyen and Cordilia James: “You might not talk to your friends about your monthly cycle, but there’s a good chance you talk to an app about it. And why not? Period-tracking apps are more convenient than using a diary, and the insights are more interesting, too. 

But how much do you know about the ways apps and trackers collect, store—and sometimes share—your fertility and menstrual-cycle data?

The question has taken on new importance following the leak of a draft Supreme Court opinion that would overturn Roe v. Wade. Roe established a constitutional right to abortion, and should the court reverse its 1973 decision, about half the states in the U.S. are likely to restrict or outright ban the procedure.

Phone and app data have long been shared and sold without prominent disclosure, often for advertising purposes. HIPAA, aka the Health Insurance Portability and Accountability Act, might protect information shared between you and your healthcare provider, but it doesn’t typically apply to data you put into an app, even a health-related one. Flo Health Inc., maker of a popular period and ovulation tracker, settled with the Federal Trade Commission in 2021 for sharing sensitive health data with Facebook without making the practice clear to users.

The company completed an independent privacy audit earlier this year. “We remain committed to ensuring the utmost privacy for our users and want to make it clear that Flo does not share health data with any company,” a spokeswoman said.

In a scenario where Roe is overturned, your digital breadcrumbs—including the kind that come from period trackers—could be used against you in states where laws criminalize aiding in or undergoing abortion, say legal experts.

“The importance of menstrual data is not merely speculative. It has been relevant to the government before, in investigations and restrictions,” said Leah Fowler, research director at University of Houston’s Health Law and Policy Institute. She cited a 2019 hearing where Missouri’s state health department admitted to keeping a spreadsheet of Planned Parenthood abortion patients, which included the dates of their last menstrual period.

Prosecutors have also obtained other types of digital information, including text messages and search histories, as evidence for abortion-related cases…(More)”.

To Play Is the Thing: How Game Design Principles Can Make Online Deliberation Compelling


Paper by John Gastil: “This essay draws from game design to improve the prospects of democratic deliberation during government consultation with the public. The argument begins by reviewing the problem of low-quality deliberation in contemporary discourse, then explains how games can motivate participants to engage in demanding behaviors, such as deliberation. Key design features include: the origin, governance, and oversight of the game; the networked small groups at the center of the game; the objectives of these groups; the purpose of artificial intelligence and automated metrics for measuring deliberation; the roles played by public officials and nongovernmental organizations during the game; and the long-term payoff of playing the game for both its convenors and its participants. The essay concludes by considering this project’s wider theoretical significance for deliberative democracy, the first steps for governments and nonprofit organizations adopting this design, and the hazards of using advanced digital technology…(More)”.

Are blockchains decentralized?


Trail of Bits report: “Blockchains can help push the boundaries of current technology in useful ways. However, to make good risk decisions involving exciting and innovative technologies, people need demonstrable facts that are arrived at through reproducible methods and open data.

We believe the risks inherent in blockchains and cryptocurrencies have been poorly described and are often ignored—or even mocked—by those seeking to cash in on this decade’s gold rush.

In response to recent market turmoil and plummeting prices, proponents of cryptocurrency point to the technology’s fundamentals as sound. Are they?

Over the past year, Trail of Bits was engaged by the Defense Advanced Research Projects Agency (DARPA) to examine the fundamental properties of blockchains and the cybersecurity risks associated with them. DARPA wanted to understand those security assumptions and determine to what degree blockchains are actually decentralized.

To answer DARPA’s question, Trail of Bits researchers performed analyses and meta-analyses of prior academic work and of real-world findings that had never before been aggregated, updating prior research with new data in some cases. They also did novel work, building new tools and pursuing original research.

The resulting report is a 30,000-foot view of what’s currently known about blockchain technology. Whether these findings affect financial markets is out of the scope of the report: our work at Trail of Bits is entirely about understanding and mitigating security risk.

The report also contains links to the substantial supporting and analytical materials. Our findings are reproducible, and our research is open-source and freely distributable. So you can dig in for yourself.

Key findings

  • Blockchain immutability can be broken not by exploiting cryptographic vulnerabilities, but instead by subverting the properties of a blockchain’s implementations, networking, and consensus protocols. We show that a subset of participants can garner undue, centralized control over the entire system:
    • While the encryption used within cryptocurrencies is for all intents and purposes secure, it does not guarantee security, as touted by proponents.
    • Bitcoin traffic is unencrypted; any third party on the network route between nodes (e.g., internet service providers, Wi-Fi access point operators, or governments) can observe and choose to drop any messages they wish.
    • Tor is now the largest network provider in Bitcoin; about 55% of Bitcoin nodes were addressable only via Tor (as of March 2022). A malicious Tor exit node can modify or drop traffic….(More)”

Smartphone apps in the COVID-19 pandemic


Paper by Jay A. Pandit, Jennifer M. Radin, Giorgio Quer & Eric J. Topol: “At the beginning of the COVID-19 pandemic, analog tools such as nasopharyngeal swabs for PCR tests were center stage and the major prevention tactics of masking and physical distancing were a throwback to the 1918 influenza pandemic. Overall, there has been scant regard for digital tools, particularly those based on smartphone apps, which is surprising given the ubiquity of smartphones across the globe. Smartphone apps, given their accessibility in a time of physical distancing, were widely used for tracking, tracing and educating the public about COVID-19. Despite limitations, such as concerns around data privacy, data security, digital health illiteracy and structural inequities, there is ample evidence that apps are beneficial for understanding outbreak epidemiology, individual screening and contact tracing. While there were successes and failures in each category, outbreak epidemiology and individual screening were substantially enhanced by the reach of smartphone apps and accessory wearables. Continued use of apps within the digital infrastructure promises to provide an important tool for rigorous investigation of outcomes both in the ongoing outbreak and in future epidemics…(More)”.

Many researchers say they’ll share data — but don’t


Article by Clare Watson: “Most biomedical and health researchers who declare their willingness to share the data behind journal articles do not respond to access requests or hand over the data when asked, a study reports¹.

Livia Puljak, who studies evidence-based medicine at the Catholic University of Croatia in Zagreb, and her colleagues analysed 3,556 biomedical and health science articles published in a month by 282 BMC journals. (BMC is part of Springer Nature, the publisher of Nature; Nature’s news team is editorially independent of its publisher.)

The team identified 381 articles with links to data stored in online repositories and another 1,792 papers for which the authors indicated in statements that their data sets would be available on reasonable request. The remaining studies stated that their data were in the published manuscript and its supplements, or generated no data, so sharing did not apply.

But of the 1,792 manuscripts for which the authors stated they were willing to share their data, more than 90% of corresponding authors either declined or did not respond to requests for raw data (see ‘Data-sharing behaviour’). Only 14%, or 254, of the contacted authors responded to e-mail requests for data, and a mere 6.7%, or 120 authors, actually handed over the data in a usable format. The study was published in the Journal of Clinical Epidemiology on 29 May.
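The percentages above can be sanity-checked against the raw counts. The quick calculation below is a sketch, assuming (as the article's wording suggests) that the 1,792 papers whose statements promised data on request form the denominator:

```python
# Reproduce the data-sharing percentages reported in the study.
# Counts come from the article; using the 1,792 "data available on
# reasonable request" papers as the denominator is an assumption.
papers_promising_data = 1792
authors_who_responded = 254   # answered e-mail requests for data
authors_who_shared = 120      # handed over data in a usable format

responded_pct = 100 * authors_who_responded / papers_promising_data
shared_pct = 100 * authors_who_shared / papers_promising_data
not_shared_pct = 100 * (papers_promising_data - authors_who_shared) / papers_promising_data

print(f"Responded to requests: {responded_pct:.1f}%")  # ~14.2%
print(f"Shared usable data:    {shared_pct:.1f}%")     # ~6.7%
print(f"Did not share:         {not_shared_pct:.1f}%") # ~93.3%, i.e. "more than 90%"
```

Note that the "more than 90%" figure counts everyone who failed to hand over usable data, including authors who responded but ultimately declined.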

[Graphic: Data-sharing behaviour, showing the percentage of authors willing to share data. Source: Livia Puljak et al.]

Puljak was “flabbergasted” that so few researchers actually shared their data. “There is a gap between what people say and what people do,” she says. “Only when we ask for the data can we see their attitude towards data sharing.”

“It’s quite dismaying that [researchers] are not coming forward with the data,” says Rebecca Li, who is executive director of non-profit global data-sharing platform Vivli and is based in Cambridge, Massachusetts…(More)”.