Technology is Not Neutral: A Short Guide to Technology Ethics


Book by Stephanie Hare: “It seems that just about every new technology that we bring to bear on improving our lives brings with it some downside, side effect or unintended consequence.

These issues can pose very real and growing ethical problems for all of us. For example, automated facial recognition can make life easier and safer for us – but it also poses huge issues with regard to privacy, ownership of data and even identity theft. How do we understand and frame these debates, and work out strategies at personal and governmental levels?

Technology Is Not Neutral: A Short Guide to Technology Ethics addresses one of today’s most pressing problems: how to create and use tools and technologies to maximize benefits and minimize harms. Drawing on the author’s experience as a technologist, political risk analyst and historian, the book offers a practical and cross-disciplinary approach that will inspire anyone creating, investing in or regulating technology, and it will empower all readers to better hold technology to account…(More)”.

New laws to strengthen Canadians’ privacy protection and trust in the digital economy


Press Release: “Canadians increasingly rely on digital technology to connect with loved ones, to work and to innovate. That’s why the Government of Canada is committed to making sure Canadians can benefit from the latest technologies, knowing that their personal information is safe and secure and that companies are acting responsibly.

Today, the Honourable François-Philippe Champagne, Minister of Innovation, Science and Industry, together with the Honourable David Lametti, Minister of Justice and Attorney General of Canada, introduced the Digital Charter Implementation Act, 2022, which will significantly strengthen Canada’s private sector privacy law, create new rules for the responsible development and use of artificial intelligence (AI), and continue advancing the implementation of Canada’s Digital Charter. As such, the Digital Charter Implementation Act, 2022 will include three proposed acts: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.

The proposed Consumer Privacy Protection Act will address the needs of Canadians who rely on digital technology and respond to feedback received on previous proposed legislation. This law will ensure that the privacy of Canadians will be protected and that innovative businesses can benefit from clear rules as technology continues to evolve. This includes:

  • increasing control and transparency when Canadians’ personal information is handled by organizations;
  • giving Canadians the freedom to move their information from one organization to another in a secure manner;
  • ensuring that Canadians can request that their information be disposed of when it is no longer needed;
  • establishing stronger protections for minors, including by limiting organizations’ right to collect or use information on minors and holding organizations to a higher standard when handling minors’ information;
  • providing the Privacy Commissioner of Canada with broad order-making powers, including the ability to order a company to stop collecting data or using personal information; and
  • establishing significant fines for non-compliant organizations—with fines of up to 5% of global revenue or $25 million, whichever is greater, for the most serious offences.

The proposed Personal Information and Data Protection Tribunal Act will enable the creation of a new tribunal to facilitate the enforcement of the Consumer Privacy Protection Act. 

The proposed Artificial Intelligence and Data Act will introduce new rules to strengthen Canadians’ trust in the development and deployment of AI systems, including:

  • protecting Canadians by ensuring high-impact AI systems are developed and deployed in a way that identifies, assesses and mitigates the risks of harm and bias;
  • establishing an AI and Data Commissioner to support the Minister of Innovation, Science and Industry in fulfilling ministerial responsibilities under the Act, including by monitoring company compliance, ordering third-party audits, and sharing information with other regulators and enforcers as appropriate; and
  • outlining clear criminal prohibitions and penalties regarding the use of data obtained unlawfully for AI development or where the reckless deployment of AI poses serious harm and where there is fraudulent intent to cause substantial economic loss through its deployment…(More)”.

How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned


Article by Nicole Nguyen and Cordilia James: “You might not talk to your friends about your monthly cycle, but there’s a good chance you talk to an app about it. And why not? Period-tracking apps are more convenient than using a diary, and the insights are more interesting, too. 

But how much do you know about the ways apps and trackers collect, store—and sometimes share—your fertility and menstrual-cycle data?

The question has taken on new importance following the leak of a draft Supreme Court opinion that would overturn Roe v. Wade. Roe established a constitutional right to abortion, and should the court reverse its 1973 decision, about half the states in the U.S. are likely to restrict or outright ban the procedure.

Phone and app data have long been shared and sold without prominent disclosure, often for advertising purposes. HIPAA, aka the Health Insurance Portability and Accountability Act, might protect information shared between you and your healthcare provider, but it doesn’t typically apply to data you put into an app, even a health-related one. Flo Health Inc., maker of a popular period and ovulation tracker, settled with the Federal Trade Commission in 2021 for sharing sensitive health data with Facebook without making the practice clear to users.

The company completed an independent privacy audit earlier this year. “We remain committed to ensuring the utmost privacy for our users and want to make it clear that Flo does not share health data with any company,” a spokeswoman said.

In a scenario where Roe is overturned, your digital breadcrumbs—including the kind that come from period trackers—could be used against you in states where laws criminalize aiding in or undergoing abortion, say legal experts.

“The importance of menstrual data is not merely speculative. It has been relevant to the government before, in investigations and restrictions,” said Leah Fowler, research director at University of Houston’s Health Law and Policy Institute. She cited a 2019 hearing where Missouri’s state health department admitted to keeping a spreadsheet of Planned Parenthood abortion patients, which included the dates of their last menstrual period.

Prosecutors have also obtained other types of digital information, including text messages and search histories, as evidence for abortion-related cases…(More)”.

To Play Is the Thing: How Game Design Principles Can Make Online Deliberation Compelling


Paper by John Gastil: “This essay draws from game design to improve the prospects of democratic deliberation during government consultation with the public. The argument begins by reviewing the problem of low-quality deliberation in contemporary discourse, then explains how games can motivate participants to engage in demanding behaviors, such as deliberation. Key design features include: the origin, governance, and oversight of the game; the networked small groups at the center of the game; the objectives of these groups; the purpose of artificial intelligence and automated metrics for measuring deliberation; the roles played by public officials and nongovernmental organizations during the game; and the long-term payoff of playing the game for both its convenors and its participants. The essay concludes by considering this project’s wider theoretical significance for deliberative democracy, the first steps for governments and nonprofit organizations adopting this design, and the hazards of using advanced digital technology…(More)”.

Are blockchains decentralized?


Trail of Bits report: “Blockchains can help push the boundaries of current technology in useful ways. However, to make good risk decisions involving exciting and innovative technologies, people need demonstrable facts that are arrived at through reproducible methods and open data.

We believe the risks inherent in blockchains and cryptocurrencies have been poorly described and are often ignored—or even mocked—by those seeking to cash in on this decade’s gold rush.

In response to recent market turmoil and plummeting prices, proponents of cryptocurrency point to the technology’s fundamentals as sound. Are they?

Over the past year, Trail of Bits was engaged by the Defense Advanced Research Projects Agency (DARPA) to examine the fundamental properties of blockchains and the cybersecurity risks associated with them. DARPA wanted to understand those security assumptions and determine to what degree blockchains are actually decentralized.

To answer DARPA’s question, Trail of Bits researchers performed analyses and meta-analyses of prior academic work and of real-world findings that had never before been aggregated, updating prior research with new data in some cases. They also did novel work, building new tools and pursuing original research.

The resulting report is a 30,000-foot view of what’s currently known about blockchain technology. Whether these findings affect financial markets is outside the scope of the report: our work at Trail of Bits is entirely about understanding and mitigating security risk.

The report also contains links to the substantial supporting and analytical materials. Our findings are reproducible, and our research is open-source and freely distributable. So you can dig in for yourself.

Key findings

  • Blockchain immutability can be broken not by exploiting cryptographic vulnerabilities, but instead by subverting the properties of a blockchain’s implementations, networking, and consensus protocols. We show that a subset of participants can garner undue, centralized control over the entire system:
    • While the encryption used within cryptocurrencies is for all intents and purposes secure, it does not guarantee the security of the system as a whole, as proponents often claim it does.
    • Bitcoin traffic is unencrypted; any third party on the network route between nodes (e.g., internet service providers, Wi-Fi access point operators, or governments) can observe and choose to drop any messages they wish.
    • Tor is now the largest network provider in Bitcoin; just about 55% of Bitcoin nodes were addressable only via Tor (as of March 2022). A malicious Tor exit node can modify or drop traffic….(More)”
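
One of the findings above is that Bitcoin traffic travels over the network unencrypted, so any on-path observer can read or drop it. As a minimal, hedged illustration of what “unencrypted” means here, the sketch below (not taken from the Trail of Bits report) builds and parses a standard Bitcoin wire-format message header: a plaintext network magic, an ASCII command name, a payload length and a checksum, all readable without any keys.

```python
import hashlib
import struct

MAGIC_MAINNET = bytes.fromhex("f9beb4d9")  # plaintext network identifier prefixed to every message


def build_message(command: str, payload: bytes = b"") -> bytes:
    """Frame a Bitcoin P2P message: magic + command + payload length + checksum + payload."""
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    header = (
        MAGIC_MAINNET
        + command.encode("ascii").ljust(12, b"\x00")  # command name, null-padded to 12 bytes
        + struct.pack("<I", len(payload))             # payload length, little-endian uint32
        + checksum                                    # first 4 bytes of double SHA-256 of the payload
    )
    return header + payload


def parse_header(raw: bytes) -> dict:
    """Read the header fields exactly as an on-path observer could, with no keys required."""
    return {
        "magic": raw[0:4].hex(),
        "command": raw[4:16].rstrip(b"\x00").decode("ascii"),
        "payload_length": struct.unpack("<I", raw[16:20])[0],
        "checksum": raw[20:24].hex(),
    }


if __name__ == "__main__":
    msg = build_message("verack")  # a real message type that carries an empty payload
    print(parse_header(msg))
    # {'magic': 'f9beb4d9', 'command': 'verack', 'payload_length': 0, 'checksum': '5df6e0e2'}
```

Because nothing in this framing is encrypted or authenticated for a particular recipient, an intermediary such as an ISP, Wi-Fi operator or exit node needs only access to the network path to observe, drop or alter messages, which is the point of the finding above.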

Smartphone apps in the COVID-19 pandemic


Paper by Jay A. Pandit, Jennifer M. Radin, Giorgio Quer & Eric J. Topol: “At the beginning of the COVID-19 pandemic, analog tools such as nasopharyngeal swabs for PCR tests were center stage and the major prevention tactics of masking and physical distancing were a throwback to the 1918 influenza pandemic. Overall, there has been scant regard for digital tools, particularly those based on smartphone apps, which is surprising given the ubiquity of smartphones across the globe. Smartphone apps, given their accessibility at a time of physical distancing, were widely used for tracking, tracing and educating the public about COVID-19. Despite limitations, such as concerns around data privacy, data security, digital health illiteracy and structural inequities, there is ample evidence that apps are beneficial for understanding outbreak epidemiology, individual screening and contact tracing. While there were successes and failures in each category, outbreak epidemiology and individual screening were substantially enhanced by the reach of smartphone apps and accessory wearables. Continued use of apps within the digital infrastructure promises to provide an important tool for rigorous investigation of outcomes both in the ongoing outbreak and in future epidemics…(More)”.

Many researchers say they’ll share data — but don’t


Article by Clare Watson: “Most biomedical and health researchers who declare their willingness to share the data behind journal articles do not respond to access requests or hand over the data when asked, a study reports.

Livia Puljak, who studies evidence-based medicine at the Catholic University of Croatia in Zagreb, and her colleagues analysed 3,556 biomedical and health science articles published in a month by 282 BMC journals. (BMC is part of Springer Nature, the publisher of Nature; Nature’s news team is editorially independent of its publisher.)

The team identified 381 articles with links to data stored in online repositories and another 1,792 papers for which the authors indicated in statements that their data sets would be available on reasonable request. The remaining studies stated that their data were in the published manuscript and its supplements, or generated no data, so sharing did not apply.

But of the 1,792 manuscripts for which the authors stated they were willing to share their data, more than 90% of corresponding authors either declined or did not respond to requests for raw data (see ‘Data-sharing behaviour’). Only 14%, or 254, of the contacted authors responded to e-mail requests for data, and a mere 6.7%, or 120 authors, actually handed over the data in a usable format. The study was published in the Journal of Clinical Epidemiology on 29 May.

Figure: Data-sharing behaviour, showing the percentage of authors who were willing to share data. Source: Livia Puljak et al.
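
The proportions quoted above follow directly from the raw counts in the study summary. The short sketch below, using only the numbers reported in the article, recomputes the response and sharing rates as a sanity check.

```python
# Counts quoted in the article summary above.
available_on_request = 1_792   # papers whose authors said data were available on reasonable request
responded_to_request = 254     # corresponding authors who replied to the e-mail request
shared_usable_data = 120       # authors who actually handed over data in a usable format

response_rate = responded_to_request / available_on_request
sharing_rate = shared_usable_data / available_on_request
did_not_share = 1 - sharing_rate  # everyone who declined, ignored the request, or never delivered

print(f"Responded to requests: {response_rate:.1%}")  # ~14.2%, matching the reported 14%
print(f"Shared usable data:    {sharing_rate:.1%}")   # ~6.7%, matching the reported 6.7%
print(f"Did not share:         {did_not_share:.1%}")  # ~93.3%, consistent with 'more than 90%'
```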

Puljak was “flabbergasted” that so few researchers actually shared their data. “There is a gap between what people say and what people do,” she says. “Only when we ask for the data can we see their attitude towards data sharing.”

“It’s quite dismaying that [researchers] are not coming forward with the data,” says Rebecca Li, who is executive director of non-profit global data-sharing platform Vivli and is based in Cambridge, Massachusetts…(More)”.

Machine Learning Can Predict Shooting Victimization Well Enough to Help Prevent It


Paper by Sara B. Heller, Benjamin Jakubowski, Zubin Jelveh & Max Kapustin: “This paper shows that shootings are predictable enough to be preventable. Using arrest and victimization records for almost 644,000 people from the Chicago Police Department, we train a machine learning model to predict the risk of being shot in the next 18 months. We address central concerns about police data and algorithmic bias by predicting shooting victimization rather than arrest, which we show accurately captures risk differences across demographic groups despite bias in the predictors. Out-of-sample accuracy is strikingly high: of the 500 people with the highest predicted risk, 13 percent are shot within 18 months, a rate 130 times higher than the average Chicagoan. Although Black male victims more often have enough police contact to generate predictions, those predictions are not, on average, inflated; the demographic composition of predicted and actual shooting victims is almost identical. There are legal, ethical, and practical barriers to using these predictions to target law enforcement. But using them to target social services could have enormous preventive benefits: predictive accuracy among the top 500 people justifies spending up to $123,500 per person for an intervention that could cut their risk of being shot in half….(More)”.
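
The abstract’s headline figures support a simple back-of-the-envelope calculation. The sketch below uses only the quoted numbers (the 13 percent victimization rate among the top 500, the 130-fold risk multiple, a hypothetical intervention that halves the risk, and the $123,500 break-even spend) to recover the average Chicagoan’s implied baseline risk and the implied social cost of a shooting; that last figure, roughly $1.9 million, is inferred here and is not stated in the excerpt.

```python
# Figures quoted in the abstract above.
risk_top_500 = 0.13         # share of the 500 highest-risk people shot within 18 months
risk_multiple = 130         # that rate is 130 times the average Chicagoan's
risk_reduction = 0.5        # hypothetical intervention that cuts the risk of being shot in half
breakeven_spend = 123_500   # quoted maximum justified spend per person, in dollars

baseline_risk = risk_top_500 / risk_multiple                    # implied average 18-month risk
shootings_prevented_per_person = risk_top_500 * risk_reduction  # expected shootings averted per targeted person
implied_cost_per_shooting = breakeven_spend / shootings_prevented_per_person  # inferred, not stated

print(f"Average Chicagoan's implied 18-month risk: {baseline_risk:.3%}")                  # ~0.100%
print(f"Implied social cost per prevented shooting: ${implied_cost_per_shooting:,.0f}")   # ~$1,900,000
```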

China’s Expanding Surveillance State


Article by Isabelle Qian, Muyi Xiao, Paul Mozur and Alexander Cardia in the New York Times: “China’s ambition to collect a staggering amount of personal data from everyday citizens is more expansive than previously known, a Times investigation has found. Phone-tracking devices are now everywhere. The police are creating some of the largest DNA databases in the world. And the authorities are building upon facial recognition technology to collect voice prints from the general public.

The Times’s Visual Investigations team and reporters in Asia spent over a year analyzing more than a hundred thousand government bidding documents. These documents call for companies to bid on contracts to provide surveillance technology; they include product requirements and budget sizes, and they sometimes describe at length the strategic thinking behind the purchases. Chinese laws stipulate that agencies must keep records of bids and make them public, but in reality the documents are scattered across hard-to-search web pages that are often taken down quickly without notice. ChinaFile, a digital magazine published by the Asia Society, collected the bids and shared them exclusively with The Times.

This unprecedented access allowed The Times to study China’s surveillance capabilities. The Chinese government’s goal is clear: designing a system to maximize what the state can find out about a person’s identity, activities and social connections, which could ultimately help the government maintain its authoritarian rule.

Here are the investigation’s major revelations.

Analysts estimate that more than half of the world’s nearly one billion surveillance cameras are in China, but it had been difficult to gauge how they were being used, what they captured and how much data they generated. The Times analysis found that the police strategically chose locations to maximize the amount of data their facial recognition cameras could collect…

The Chinese authorities are realistic about their technological limitations. According to one bidding document, the Ministry of Public Security, China’s top police agency, believed the country’s video surveillance systems still lacked analytical capabilities. One of the biggest problems they identified was that the data had not been centralized….(More)”.

Meet the fact-checkers decoding Sri Lanka’s meltdown


Article by Nilesh Christopher: “On the evening of May 3, the atmosphere at Galle Face Green, an esplanade along the coastline of Sri Lanka’s capital city of Colombo, was carnivalesque. Parents strolled on sidewalks with toddlers hoisted on their shoulders. Teenagers wearing bandanas played the flute and blew plastic horns. People climbed atop makeshift podiums to address the crowds, greeted by scattered applause. 

The crowd of a few hundred was part of a series of protests that had been underway since mid-March, demanding the ouster of President Gotabaya Rajapaksa. For months, the country has been trapped in a brutal economic crisis: Sri Lanka is currently unable to pay for imports of essentials, such as food, medicines, and fuel. Populist tax cuts, an abrupt ban on fertilizer imports, decimated crop yields, and the collapse of tourism during the pandemic all helped to push the country into the worst economic crisis it has faced since gaining independence in 1948.

The island nation owes nearly $7 billion this year and has next to no foreign reserves left. “We don’t have any gas. We don’t have fuel and some food items. We lose power for three to four hours daily now,” Nalin Chamara, 42, a hotelier protesting with his family and children, told Rest of World. Meanwhile, the presidential family at one point controlled around 70% of the nation’s budget and ran it as a family business, spending billions of dollars of borrowed money on vanity projects, such as an extravagant airport and cricket stadium that now sit almost entirely unused.

Yudhanjaya Wijeratne walked among the Galle Face Green crowds, surveying the scene. He pointed out where demonstrators had jury-rigged their own electricity supply by welding solar panels atop an open truck and connecting them to a battery. The power generated was being used to charge over two dozen smartphones inside a big blue tent, which also contained a library housing 15,000 books. “This is what Sri Lankans will do if you let them build stuff. Fucking build infrastructure from scratch,” Wijeratne said. The protest featured a giant middle finger monument made of plastic bottles, directed at the Rajapaksas. “Our real educational export should be B.Sc. in protest,” Wijeratne said.

Wijeratne, 29 years old, is best known as the author of Numbercaste, a science fiction novel about a near-future world where people’s importance in society is decided based on the all-powerful Number, a credit score determined by their social circle and social network data. But he is also the chief executive of Watchdog, a research collective based in Colombo that uses fact-checking and open source intelligence (OSINT) methods to investigate Sri Lanka’s ongoing crisis. As part of its work, he and his 12-member team of coders, journalists, economists, and students track, time stamp, geolocate, and document videos of protests shared online.

Watchdog’s protest tracker has emerged as the most comprehensive online archive of the historic events unfolding in Sri Lanka. Its data set, which comprises 597 different protests and 49 conflicts, has been used by global news organizations to demonstrate the extent of public pushback.

“[Our] core mission is simple,” Wijeratne told Rest of World. “We want to help people understand the infrastructure they use. The concrete, the laws, the policies, and the social contracts that they live under. We want to help people understand the causality of how they came to be and how they operate.”…(More)”.