Invisible Women: Exposing Data Bias in a World Designed for Men


Book by Caroline Criado Perez: “Imagine a world where your phone is too big for your hand, where your doctor prescribes a drug that is wrong for your body, where in a car accident you are 47% more likely to be seriously injured, where every week the countless hours of work you do are not recognised or valued. If any of this sounds familiar, chances are that you’re a woman.

Invisible Women shows us how, in a world largely built for and by men, we are systematically ignoring half the population. It exposes the gender data gap – a gap in our knowledge that is at the root of perpetual, systemic discrimination against women, and that has created a pervasive but invisible bias with a profound effect on women’s lives.

Award-winning campaigner and writer Caroline Criado Perez brings together for the first time an impressive range of case studies, stories and new research from across the world that illustrate the hidden ways in which women are forgotten, and the impact this has on their health and well-being. From government policy and medical research, to technology, workplaces, urban planning and the media, Invisible Women reveals the biased data that excludes women. In making the case for change, this powerful and provocative book will make you see the world anew….(More)”

Data Trusts: Ethics, Architecture and Governance for Trustworthy Data Stewardship


Web Science Institute Paper by Kieron O’Hara: “In their report on the development of the UK AI industry, Wendy Hall and Jérôme Pesenti recommend the establishment of data trusts, “proven and trusted frameworks and agreements” that will “ensure exchanges [of data] are secure and mutually beneficial” by promoting trust in the use of data for AI. Hall and Pesenti leave the structure of data trusts open, and the purpose of this paper is to explore two questions: (a) what existing structures can data trusts exploit, and (b) what relationship do data trusts have to trusts as they are understood in law?

The paper defends the following thesis: a data trust works within the law to provide ethical, architectural and governance support for trustworthy data processing.

Data trusts are therefore both constraining and liberating. They constrain: they respect current law, so they cannot render currently illegal actions legal. They are intended to increase trust, and so they will typically act as further constraints on data processors, adding the constraints of trustworthiness to those of law. Yet they also liberate: if data processors are perceived as trustworthy, they will get improved access to data.

Most work on data trusts has up to now focused on gaining and supporting the trust of data subjects in data processing. However, all actors involved in AI – data consumers, data providers and data subjects – have trust issues which data trusts need to address.

Furthermore, it is not only personal data that creates trust issues; the same may be true of any dataset whose release might involve an organisation risking competitive advantage. The paper addresses four areas….(More)”.

Harnessing the Power of Open Data for Children and Families


Article by Kathryn L.S. Pettit and Rob Pitingolo: “Child advocacy organizations, such as members of the KIDS COUNT network, have proven the value of using data to advocate for policies and programs to improve the lives of children and families. These organizations use data to educate policymakers and the public about how children are faring in their communities. They understand the importance of high-quality information for policy and decisionmaking. And in the past decade, many state governments have embraced the open data movement. Their data portals promote government transparency and increase data access for a wide range of users inside and outside government.

At the request of the Annie E. Casey Foundation, which funds the KIDS COUNT network, the authors conducted research to explore how these state data efforts could bring greater benefits to local communities. Interviews with child advocates and open data providers confirmed the opportunity for child advocacy organizations and state governments to leverage open data to improve the lives of children and families. But accomplishing this goal will require new practices on both sides.

This brief first describes the current state of practice for child advocates using data and for state governments publishing open data. It then provides suggestions for what it would take from both sides to increase the use of open data to improve the lives of children and families. Child and family advocates will find five action steps in section 2. These steps encourage them to assess their data needs, build relationships with state data managers, and advocate for new data and preservation of existing data.
State agency staff will find five action steps in section 3. These steps describe how staff can engage diverse stakeholders, including agency staff beyond typical “data people” and data users outside government. Although this brief focuses on state-level institutions, local advocates and governments will find these lessons relevant. In fact, many of the lessons and best practices are based on pioneering efforts at the local level….(More)”.

Big data needs big governance: best practices from Brain-CODE, the Ontario Brain Institute’s neuroinformatics platform


Shannon C. Lefaivre et al in Frontiers in Genetics: “The Ontario Brain Institute (OBI) has begun to catalyze scientific discovery in the field of neuroscience through its large-scale informatics platform, known as Brain-CODE. The platform supports the capture, storage, federation, sharing and analysis of different data types across several brain disorders. Underlying the platform is a robust and scalable data governance structure which allows for the flexibility to advance scientific understanding, while protecting the privacy of research participants.

Recognizing the value of an open science approach to enabling discovery, the governance structure was designed not only to support collaborative research programs, but also to support open science by making all data open and accessible in the future. OBI’s rigorous approach to data sharing maintains the accessibility of research data for big discoveries without compromising privacy and security. Taking a Privacy by Design approach to both data sharing and development of the platform has allowed OBI to establish some best practices related to large scale data sharing within Canada. The aim of this report is to highlight these best practices and develop a key open resource which may be referenced during the development of similar open science initiatives….(More)”.

Balancing information governance obligations when accessing social care data for collaborative research


Paper by Malkiat Thiarai, Sarunkorn Chotvijit and Stephen Jarvis: “There is significant national interest in tackling issues surrounding the needs of vulnerable children and adults. This paper aims to argue that much value can be gained from the application of new data-analytic approaches to assist with the care provided to vulnerable children. This paper highlights the ethical and information governance issues raised in the development of a research project that sought to access and analyse children’s social care data.


The paper documents the process involved in identifying, accessing and using data held in Birmingham City Council’s social care system for collaborative research with a partner organisation. This includes identifying the data, its structure and format; understanding the Data Protection Act 1998 and 2018 (DPA) exemptions that are relevant to ensure that legal obligations are met; data security and access management; the ethical and governance approval process.


The findings will include approaches to understanding the data, its structure and accessibility tasks involved in addressing ethical and legal obligations and requirements of the ethical and governance processes….(More)”.

Blockchain and distributed ledger technologies in the humanitarian sector


Report by Giulio Coppi and Larissa Fast at ODI (Overseas Development Institute): “Blockchain and the wider category of distributed ledger technologies (DLTs) promise a more transparent, accountable, efficient and secure way of exchanging decentralised stores of information that are independently updated, automatically replicated and immutable. The key components of DLTs include shared recordkeeping, multi-party consensus, independent validation, tamper evidence and tamper resistance.

Building on these claims, proponents suggest DLTs can address common problems of non-profit organisations and NGOs, such as transparency, efficiency, scale and sustainability. Current humanitarian uses of DLT, illustrated in this report, include financial inclusion, land titling, remittances, improving the transparency of donations, reducing fraud, tracking support to beneficiaries from multiple sources, transforming governance systems, micro-insurance, cross-border transfers, cash programming, grant management and organisational governance.

This report, commissioned by the Global Alliance for Humanitarian Innovation (GAHI), examines current DLT uses by the humanitarian sector to outline lessons for the project, policy and system levels. It offers recommendations to address the challenges that must be overcome before DLTs can be ethically, safely, appropriately and effectively scaled in humanitarian contexts….(More)”.

Evolving Measurement for an Evolving Economy: Thoughts on 21st Century US Economic Statistics


Ron S. Jarmin at the Journal of Economic Perspectives: “The system of federal economic statistics developed in the 20th century has served the country well, but the current methods for collecting and disseminating these data products are unsustainable. These statistics are heavily reliant on sample surveys. Recently, however, response rates for both household and business surveys have declined, increasing costs and threatening quality. Existing statistical measures, many developed decades ago, may also miss important aspects of our rapidly evolving economy; moreover, they may not be sufficiently accurate, timely, or granular to meet the increasingly complex needs of data users. Meanwhile, the rapid proliferation of online data and more powerful computation make privacy and confidentiality protections more challenging. There is broad agreement on the need to transform government statistical agencies from the 20th century survey-centric model to a 21st century model that blends structured survey data with administrative and unstructured alternative digital data sources. In this essay, I describe some work underway that hints at what 21st century official economic measurement will look like and offer some preliminary comments on what is needed to get there….(More)”.

Can Data Save U.N. Peacekeeping?


Adam Day at World Policy Review: “Does international peacekeeping protect civilians caught up in civil wars? Do the 16,000 United Nations peacekeepers deployed in the Democratic Republic of the Congo actually save lives, and if so how many? Did the 9,000 patrols conducted by the U.N. Mission in South Sudan in the past three months protect civilians there? 

The answer is a dissatisfying “maybe.” Without a convincing story of saving lives, the U.N. is open to attacks by the likes of White House national security adviser John Bolton, who calls peacekeeping “unproductive” and pushes for further cuts to the organization’s already diminished budget. But peacekeeping can—and must—make a case for its own utility, using data already at its fingertips. …(More)”.

Privacy and Smart Cities: A Canadian Survey


Report by Sara Bannerman and Angela Orasch: “This report presents the findings of a national survey of Canadians about smart-city privacy conducted in October and November 2018. Our research questions were: How concerned are Canadians about smart-city privacy? How do these concerns intersect with age, gender, ethnicity, and location? Moreover, what are the expectations of Canadians with regard to their ability to control, use, or opt out of data collection in a smart-city context? What rights and privileges do Canadians feel are appropriate with regard to data self-determination, and what types of data are considered more sensitive than others?

What is a smart city?
A ‘smart city’ adopts digital and data-driven technologies in the planning, management and delivery of municipal services. Information and communications technologies (ICTs), data analytics, and the internet of things (IoT) are some of the main components of these technologies, joined by web design, online marketing campaigns and digital services. Such technologies can include smart utility and transportation infrastructure, smart cards, smart transit, camera and sensor networks, or data collection by businesses to provide customized advertisements or other services. Smart-city technologies “monitor, manage and regulate city flows and processes, often in real-time” (Kitchin 2014, 2).

In 2017, a framework agreement was established between Waterfront Toronto, the organization charged with revitalizing Toronto’s waterfront, and Sidewalk Labs, a subsidiary of Google’s parent company Alphabet, to develop a smart city on Toronto’s Eastern waterfront (Sidewalk Toronto 2018). This news was met with questions and concerns from experts in data privacy and the public at large regarding what was to be included in Sidewalk Labs’ smart-city vision. How would the overall governance structure function? How were the privacy rights of residents going to be protected, and what mechanisms, if any, would ensure that protection? The Toronto waterfront is just one of numerous examples of smart-city developments….(More)”.

Consumers kinda, sorta care about their data


Kim Hart at Axios: “A full 81% of consumers say that in the past year they’ve become more concerned with how companies are using their data, and 87% say they’ve come to believe companies that manage personal data should be more regulated, according to a survey out Monday by IBM’s Institute for Business Value.

Yes, but: They aren’t totally convinced they should care about how their data is being used, and many aren’t taking meaningful action after privacy breaches, according to the survey. Despite increasing data risks, 71% say it’s worth sacrificing privacy given the benefits of technology.

By the numbers:

  • 89% say technology companies need to be more transparent about their products.
  • 75% say that in the past year they’ve become less likely to trust companies with their personal data.
  • 88% say the emergence of technologies like AI increases the need for clear policies about the use of personal data.

The other side: Despite increasing awareness of privacy and security breaches, most consumers aren’t taking consequential action to protect their personal data.

  • Fewer than half (45%) report that they’ve updated privacy settings, and only 16% stopped doing business with an entity due to data misuse….(More)”.