World stumbling zombie-like into a digital welfare dystopia, warns UN human rights expert


UN Press Release: “A UN human rights expert has expressed concerns about the emergence of the “digital welfare state”, saying that all too often the real motives behind such programs are to slash welfare spending, set up intrusive government surveillance systems and generate profits for private corporate interests.

“As humankind moves, perhaps inexorably, towards the digital welfare future it needs to alter course significantly and rapidly to avoid stumbling zombie-like into a digital welfare dystopia,” the Special Rapporteur on extreme poverty and human rights, Philip Alston, says in a report to be presented to the General Assembly on Friday.

The digital welfare state is commonly presented as an altruistic and noble enterprise designed to ensure that citizens benefit from new technologies, experience more efficient government, and enjoy higher levels of well-being. But, Alston said, the digitization of welfare systems has very often been used to promote deep reductions in the overall welfare budget, a narrowing of the beneficiary pool, the elimination of some services, the introduction of demanding and intrusive forms of conditionality, the pursuit of behavioural modification goals, the imposition of stronger sanctions regimes, and a complete reversal of the traditional notion that the state should be accountable to the individual….(More)”.

Human Rights in the Age of Platforms


Book edited by Rikke Frank Jørgensen: “Today such companies as Apple, Facebook, Google, Microsoft, and Twitter play an increasingly important role in how users form and express opinions, encounter information, debate, disagree, mobilize, and maintain their privacy. What are the human rights implications of an online domain managed by privately owned platforms? According to the Guiding Principles on Business and Human Rights, adopted by the UN Human Rights Council in 2011, businesses have a responsibility to respect human rights and to carry out human rights due diligence. But this goal is dependent on the willingness of states to encode such norms into business regulations and of companies to comply. In this volume, contributors from across law and internet and media studies examine the state of human rights in today’s platform society.

The contributors consider the “datafication” of society, including the economic model of data extraction and the conceptualization of privacy. They examine online advertising, content moderation, corporate storytelling around human rights, and other platform practices. Finally, they discuss the relationship between human rights law and private actors, addressing such issues as private companies’ human rights responsibilities and content regulation…(More)”.

Is Privacy and Personal Data Set to Become the New Intellectual Property?


Paper by Leon Trakman, Robert Walters, and Bruno Zeller: “A pressing concern today is whether the rationale underlying the protection of personal data is itself a meaningful foundation for according intellectual property (IP) rights in personal data to data subjects. In particular, are there specific technological attributes about the collection, use and processing of personal data on the Internet, and global access to that data, that provide a strong justification to extend IP rights to data subjects? A central issue in so determining is whether data subjects need the protection of such rights in a technological revolution in which they are increasingly exposed to the use and abuse of their personal data. A further question is how IP law can provide them with the requisite protection of their private space, or whether other means of protecting personal data, such as through general contract rights, render IP protections redundant, or at least, less necessary. This paper maintains that lawmakers often fail to distinguish between general property and IP protection of personal data; that IP protection encompasses important attributes of both property and contract law; and that laws that implement IP protection in light of its sui generis attributes are more fitting means of protecting personal data than the alternatives. The paper demonstrates that providing IP rights in personal data goes some way toward strengthening data subjects’ control and protection over their personal data, and toward strengthening data protection law more generally. It also argues for greater harmonization of IP law across jurisdictions to ensure that the protection of personal data becomes more coherent and internationally sustainable….(More)”.

Real-time maps warn Hong Kong protesters of water cannons and riot police


Mary Hui at Quartz: “The “Be Water” nature of Hong Kong’s protests means that crowds move quickly and spread across the city. They might stage a protest in the central business district one weekend, then industrial neighborhoods and far-flung suburban towns the next. And a lot is happening at any one time at each protest. One of the key difficulties for protesters is to figure out what’s happening in the crowded, fast-changing, and often chaotic circumstances.

Citizen-led efforts to map protests in real-time are an attempt to address those challenges and answer some pressing questions for protesters and bystanders alike: Where should they go? Where have tear gas and water cannons been deployed? Where are police advancing, and are there armed thugs attacking civilians?

One of the most widely used real-time maps of the protests is HKMap.live, a volunteer-run and crowdsourced effort that officially launched in early August. It’s a dynamic map of Hong Kong that users can zoom in and out of, much like Google Maps. But in addition to detailed street and building names, this one features various emoji to communicate information at a glance: a dog for police, a worker in a yellow hardhat for protesters, a dinosaur for the police’s black-clad special tactical squad, a white speech-bubble for tear gas, two exclamation marks for danger.
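
To make that emoji legend concrete, here is a minimal, hypothetical sketch in TypeScript of how a crowdsourced map client might represent such reports. The `Report` type, `LEGEND` table, and `toMapLabel` helper are illustrative assumptions for this digest, not HKMap.live’s actual code or data model.

```typescript
// Hypothetical sketch only: not HKMap.live's actual code or data model.
// A crowdsourced report is tagged with a category, a location, and a time;
// the client renders it as an emoji marker so it can be read at a glance.

type ReportCategory =
  | "police"
  | "protesters"
  | "special_tactical_squad"
  | "tear_gas"
  | "danger";

interface Report {
  category: ReportCategory;
  lat: number;        // latitude of the sighting
  lng: number;        // longitude of the sighting
  reportedAt: number; // Unix timestamp (seconds) when the report was submitted
  note?: string;      // optional free-text detail from the reporter
}

// Emoji legend as described in the article.
const LEGEND: Record<ReportCategory, string> = {
  police: "🐕",                 // dog for police
  protesters: "👷",             // worker in a yellow hardhat for protesters
  special_tactical_squad: "🦖", // dinosaur for the black-clad tactical squad
  tear_gas: "💬",               // white speech bubble for tear gas
  danger: "‼️",                 // double exclamation mark for danger
};

// Turn a report into the short label a map client might draw at (lat, lng).
function toMapLabel(report: Report, nowSeconds = Date.now() / 1000): string {
  const ageMinutes = Math.max(0, Math.round((nowSeconds - report.reportedAt) / 60));
  const suffix = report.note ? ` (${report.note})` : "";
  return `${LEGEND[report.category]} ${ageMinutes} min ago${suffix}`;
}

// Example: a (made-up) tear-gas report submitted five minutes ago.
console.log(
  toMapLabel({
    category: "tear_gas",
    lat: 22.2796,
    lng: 114.1655,
    reportedAt: Date.now() / 1000 - 300,
  })
);
```

Keeping the legend to a handful of symbols reflects the design choice the article highlights: the information stays legible at a glance, even for someone who does not know the neighborhood.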

[Image: HKMap during a protest on August 31, 2019.]

Founded by a finance professional in his 20s who wished to be identified only as Kuma, HKMap is an attempt to level the playing field between protesters and officers, he said in an interview over chat app Telegram. While earlier in the protest movement people relied on text-based, on-the-ground live updates through public Telegram channels, Kuma found these to be too scattered to be effective, and hard to visualize unless someone knew the particular neighborhood inside out.

“The huge asymmetric information between protesters and officers led to multiple occasions of surround and capture,” said Kuma. Passersby and non-frontline protesters could also make use of the map, he said, to avoid tense conflict zones. After some of his friends were arrested in late July, he decided to build HKMap….(More)”.

Reimagining Administrative Justice: Human Rights in Small Places


Book by Margaret Doyle and Nick O’Brien: “This book reconnects everyday justice with social rights. It rediscovers human rights in the ‘small places’ of housing, education, health and social care, where administrative justice touches the citizen every day, and in doing so it re-imagines administrative justice and expands its democratic reach. The institutions of everyday justice – ombuds, tribunals and mediation – rarely herald their role in human rights frameworks, and never very loudly. For the most part, human rights and administrative justice are ships that pass in the night. Drawing on design theory, the book proposes to remedy this alienation by replacing current orthodoxies, not least that of ‘user focus’, with more promising design principles of community, network and openness. Thus re-imagined, the future of both administrative justice and social rights is demosprudential, firmly rooted in making response to citizen grievance more democratic and embedding legal change in the broader culture….(More)”.

The Ethics of Hiding Your Data From the Machines


Molly Wood at Wired: “…But now, that data is being used to train artificial intelligence, and the insights those future algorithms create could quite literally save lives.

So while targeted advertising is an easy villain, data-hogging artificial intelligence is a dangerously nuanced and highly sympathetic bad guy, like Erik Killmonger in Black Panther. And it won’t be easy to hate.

I recently met with a company that wants to do a sincerely good thing. They’ve created a sensor that pregnant women can wear, and it measures their contractions. It can reliably predict when women are going into labor, which can help reduce preterm births and C-sections. It can get women into care sooner, which can reduce both maternal and infant mortality.

All of this is an unquestionable good.

And this little device is also collecting a treasure trove of information about pregnancy and labor that is feeding into clinical research that could upend maternal care as we know it. Did you know that the way most obstetricians learn to track a woman’s progress through labor is based on a single study from the 1950s, involving 500 women, all of whom were white?…

To save the lives of pregnant women and their babies, researchers and doctors, and yes, startup CEOs and even artificial intelligence algorithms need data. To cure cancer, or at least offer personalized treatments that have a much higher possibility of saving lives, those same entities will need data….

And for us consumers, well, a blanket refusal to offer up our data to the AI gods isn’t necessarily the good choice either. I don’t want to be the person who refuses to contribute my genetic data via 23andMe to a massive research study that could, and I actually believe this is possible, lead to cures and treatments for diseases like Parkinson’s and Alzheimer’s and who knows what else.

I also think I deserve a realistic assessment of the potential for harm to find its way back to me, because I didn’t think through or wasn’t told all the potential implications of that choice—like how, let’s be honest, we all felt a little stung when we realized the 23andMe research would be through a partnership with drugmaker (and reliable drug price-hiker) GlaxoSmithKline. Drug companies, like targeted ads, are easy villains—even though this partnership actually could produce a Parkinson’s drug. But do we know what GSK’s privacy policy looks like? That deal was a level of sharing we didn’t necessarily expect….(More)”.

Corporate Duties to the Public


Book by Barnali Choudhury and Martin Petrin: “In a world where the grocery store may be more powerful than the government and corporations are the governors rather than the governed, the notion of corporations being only private actors is slowly evaporating. Gone is the view that corporations can focus exclusively on maximizing shareholder wealth. Instead, the idea that corporations owe duties to the public is capturing the attention of not only citizens and legislators, but corporations themselves. This book explores the deepening connections between corporations and the public. It explores timely – and often controversial – public issues with which corporations must grapple including the corporate purpose, civil and criminal liability, taxation, human rights, the environment and corruption. Offering readers an encompassing, balanced, and systematic understanding of the most pertinent duties corporations should bear, how they work, whether they are justified, and how they should be designed in the future, this book clarifies corporations’ roles vis-à-vis the public….(More)”.

How Can We Use Administrative Data to Prevent Homelessness among Youth Leaving Care?


Article by Naomi Nichols: “In 2017, I was part of a team of people at the Canadian Observatory on Homelessness and A Way Home Canada who wrote a policy brief titled, Child Welfare and Youth Homelessness in Canada: A proposal for action. Drawing on the results of the first pan-Canadian survey on youth homelessness, Without a Home: The National Youth Homelessness Survey, the brief focused on the disproportionate number of young people who had been involved with child protection services and then later became homeless. Indeed, 57.8% of homeless youth surveyed reported some type of involvement with child protection services over their lifetime. By comparison, in the general population, only 0.3% of young people receive child welfare services. This means youth experiencing homelessness are far more likely (roughly 190 times, on these figures) to report interactions with the child welfare system than young people in the general population.

Where research reveals systematic patterns of exclusion and neglect – that is, where findings reveal that one group is experiencing disproportionately negative outcomes (relative to the general population) in a particular public sector context – this suggests the need for changes in public policy, programming and practice. Since producing this brief, I have been working with an incredibly talented and passionate McGill undergraduate student (who also happens to be the Vice President of Youth in Care Canada), Arisha Khan. Together, we have been exploring just uses of data to better serve the interests of those young people who depend on the state for their access to basic services (e.g., housing, healthcare and food) as well as their self-efficacy and status as citizens. 

One component of this work revolved around a grant application that has just been funded by the Social Sciences and Humanities Research Council of Canada (Data Justice: Fostering equitable data-led strategies to prevent, reduce and end youth homelessness). Another aspect of our work revolved around a policy brief, which we co-wrote and published with the Montreal data-for-good organization, Powered by Data. The brief outlines how a rights-based and custodial approach to administrative data could a) effectively support young people in and leaving care to participate more actively in their transition planning and engage in institutional self-advocacy; and b) enable systemic oversight of intervention implementation and outcomes for young people in and leaving the provincial care system. We produced this brief with the hope that it would be useful to government decision-makers, service providers, researchers, and advocates interested in understanding how institutional data could be used to improve outcomes for youth in and leaving care. In particular, we wanted to explore whether a different orientation to data collection and use in child protection systems could prevent young people from graduating from provincial child welfare systems into homelessness. In addition to this practical concern, we also undertook to think through the ethical and human rights implications of more recent moves towards data-driven service delivery in Canada, focusing on how we might make this move with the best interests of young people in mind. 

As data collection, management and use practices have become more popular, research is beginning to illuminate how these new monitoring, evaluative and predictive technologies are changing governance processes within and across the public sector, as well as in civil society….(More)”.

Stop Surveillance Humanitarianism


Mark Latonero at The New York Times: “A standoff between the United Nations World Food Program and Houthi rebels in control of the capital region is threatening the lives of hundreds of thousands of civilians in Yemen.

Alarmed by reports that food is being diverted to support the rebels, the aid program is demanding that Houthi officials allow them to deploy biometric technologies like iris scans and digital fingerprints to monitor suspected fraud during food distribution.

The Houthis have reportedly blocked food delivery, painting the biometric effort as an intelligence operation, and have demanded access to the personal data on beneficiaries of the aid. The impasse led the aid organization to the decision last month to suspend food aid to parts of the starving population — once thought of as a last resort — unless the Houthis allow biometrics.

With program officials saying their staff is prevented from doing its essential jobs, turning to a technological solution is tempting. But biometrics deployed in crises can lead to a form of surveillance humanitarianism that can exacerbate risks to privacy and security.

By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need….(More)”.

How can Indigenous Data Sovereignty (IDS) be promoted and mainstreamed within open data movements?


OD Mekong Blog: “Considering Indigenous rights in the open data and technology space is a relatively new concept. Called “Indigenous Data Sovereignty” (IDS), it is defined as “the right of Indigenous peoples to govern the collection, ownership, and application of data about Indigenous communities, peoples, lands, and resources”, regardless of where the data is held or by whom. By default, this broad and all-encompassing framework bucks fundamental concepts of open data, and asks traditional open data practitioners to critically consider how open data can be used as a tool of transparency that also upholds equal rights for all…

Four main areas of concern and relevant barriers identified by participants were:

Self-determination to identify their membership

  • National governments in many states, particularly across Asia and South America, still do not allow for self-determination under the law. Even where legislation offers some recognition, it is scarcely enforced, and mainstream discourse demonises Indigenous self-determination.
  • However, because Indigenous and ethnic minorities face hardship and persecution on a daily basis, there were concerns about the applicability of data sovereignty at the local level.

Intellectual Property Protocols

  • It has become the norm for big tech companies to extract excessive amounts of data from people’s everyday lives. How do disenfranchised communities combat this?
  • Indigenous data is often misappropriated to the detriment of Indigenous peoples.
  • Intellectual property concepts, such as copyright, are not an ideal approach for protecting Indigenous knowledge and intellectual property rights because they are rooted in commercialistic ideals that are difficult to apply to Indigenous contexts. This is especially so because many groups do not practice commercialization in the globalized context. Also, as a concept based on exclusivity (i.e., when licenses expire knowledge gets transferred over as public goods), it doesn’t take into account the collectivist ideals of Indigenous peoples.

Data Governance

  • Ultimately, data protection is about protecting lives. Having the ability to use data to direct decisions on Indigenous development places greater control in the hands of Indigenous peoples.
  • National governments are barriers due to conflicts in sovereignty interests. Nation-state legal systems are often contradictory to customary laws, and thus don’t often reflect rights-based approaches.

Consent — Free Prior and Informed Consent (FPIC)

  • FPIC, referring to a set of principles that define the process and mechanisms that apply specifically to Indigenous peoples in relation to the exercise of their collective rights, is a well-known phrase. These principles are intended to ensure that Indigenous peoples are treated as sovereign peoples with their own decision-making power, customary governance systems, and collective decision-making processes, but it is questionable as to what level one can ensure true FPIC in the Indigenous context.²
  • It remains a question as to how effectively due diligence can be applied to research protocols, so as to ensure that the rights associated with FPIC and the UNDRIP framework are upheld….(More)”.