Gender Reboot: Reprogramming Gender Rights in the Age of AI

Book by Eleonore Fournier-Tombs: “This book explores gender norms and women’s rights in the age of AI. The author examines how gender dynamics have evolved in the spheres of work, self-image and safety, and education, and how these might be reflected in current challenges in AI development. The book also explores opportunities in AI to address issues facing women, and how we might harness current technological developments for gender equality. Taking a narrative tone, the book is interwoven with stories and reflections on raising young children during the COVID-19 pandemic. It includes both expert and personal interviews to create a nuanced and multidimensional perspective on the state of women’s rights and what might be done to move forward…(More)”.

India’s persistent, gendered digital divide

Article by Caiwei Chen: “In a society where women, especially unmarried girls, still have to fight to own a smartphone, would men — and institutional patriarchy — really be willing to share political power?

In September, the Indian government passed a landmark law under which a third of the seats in the lower house and state assemblies would be reserved for women. Amid the euphoria of celebrating this development, a somewhat cynical question I’ve been thinking about is: Why do only 31% of women in India own a mobile phone, compared with over 60% of men? This in a country that is poised to have 1 billion smartphone users by 2026.

It’s not that the euphoria is without merit. Twenty-seven years after the idea was first proposed, the Narendra Modi government pulled the issue out of the deep freeze and breathed it back to life. The execution of the quota will still take a few years, as it has been linked to the redrawing of constituency boundaries.

But in the meantime, as women, we should brace ourselves for the pushbacks — small and big — that will come our way.

In an increasingly wired world, this digital divide has real-life consequences.  

The gender gap — between men and women, boys and girls — isn’t only about cellular phones and internet access. This inequity perfectly encapsulates all the other biases that India’s women have had to contend with — from a disparity in education opportunities to overzealous moral policing. It is about denying women power — and even bodily autonomy…(More)”.

Building Responsive Investments in Gender Equality using Gender Data System Maturity Models

Tools and resources by Data2X and Open Data Watch: “…to help countries check the maturity of their gender data systems and set priorities for gender data investments. The new Building Responsive Investments in Data for Gender Equality (BRIDGE) tool is designed for use by gender data focal points in national statistical offices (NSOs) of low- and middle-income countries and by their partners within the national statistical system (NSS) to communicate gender data priorities to domestic sources of financing and international donors.

The BRIDGE results will help gender data stakeholders understand the current maturity level of their gender data system, diagnose strengths and weaknesses, and identify priority areas for improvement. They will also serve as an input to any roadmap or action plan developed in collaboration with key stakeholders within the NSS.

Below are links to and explanations of our ‘Gender Data System Maturity Model’ briefs (a long and short version), our BRIDGE assessment and tools methodology, how-to guide, questionnaire, and scoring form that will provide an overall assessment of system maturity and insight into potential action plans to strengthen gender data systems…(More)”.
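The assessment pairs a questionnaire with a scoring form that rolls answers up into an overall maturity level. A minimal sketch of how such a roll-up might work is below; the dimension names, the 0–4 scoring scale, and the level cut-offs are invented for illustration and are not BRIDGE's actual methodology.

```python
# Hypothetical assessment dimensions, each scored 0-4 on the questionnaire.
DIMENSIONS = ("legal_framework", "data_production", "dissemination", "financing")

def maturity_level(scores):
    """scores: dict mapping each dimension to a 0-4 questionnaire score.
    Averages the dimensions and maps the result onto a named maturity level
    (cut-offs here are illustrative assumptions)."""
    avg = sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
    if avg < 1.5:
        return "nascent", avg
    if avg < 3.0:
        return "developing", avg
    return "mature", avg
```

A country scoring well on production but poorly on financing would land in the middle band, signalling where investment should be prioritized.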

Destination? Care Blocks!

Blog by Natalia González Alarcón, Hannah Chafetz, Diana Rodríguez Franco, Uma Kalkar, Bapu Vaitla, & Stefaan G. Verhulst: “Time poverty,” caused by an overload of unpaid care work such as washing, cleaning, cooking, and caring for others, is a structural consequence of gender inequality. In the City of Bogotá, 1.2 million women — 30% of the city’s female population — carry out unpaid care work full-time. If such work were compensated, it would represent 13% of Bogotá’s GDP and 20% of the country’s GDP. Moreover, the care burden falls disproportionately on women’s shoulders and prevents them from furthering their education, achieving financial autonomy, participating in their communities, and tending to their personal wellbeing.

To address the care burden and its spillover consequences on women’s economic autonomy, well-being and political participation, in October 2020, Bogotá Mayor Claudia López launched the Care Block Initiative. Care Blocks, or Manzanas del cuidado, are centralized areas for women’s economic, social, medical, educational, and personal well-being and advancement. They provide services simultaneously for caregivers and care-receivers.

As the program expands from 19 existing Care Blocks to 45 Care Blocks by the end of 2035, decision-makers face another issue: mobility is a critical and often limiting factor for women when accessing Care Blocks in Bogotá.

On May 19th, 2023, The GovLab, Data2X, and the Secretariat for Women’s Affairs of the City Government of Bogotá co-hosted a studio aimed at scoping a purposeful and gender-conscious data collaborative to address mobility-related issues affecting access to Care Blocks in Bogotá. Convening experts across the gender, mobility, policy, and data ecosystems, the studio focused on (1) prioritizing the critical questions as they relate to mobility and access to Care Blocks and (2) identifying the data sources and actors that could be tapped to set up a new data collaborative…(More)”.

“My sex-related data is more sensitive than my financial data and I want the same level of security and privacy”: User Risk Perceptions and Protective Actions in Female-oriented Technologies

Paper by Maryam Mehrnezhad and Teresa Almeida: “The digitalization of the reproductive body has engaged myriads of cutting-edge technologies in supporting people to know and tackle their intimate health. Generally understood as female technologies (aka female-oriented technologies or ‘FemTech’), these products and systems collect a wide range of intimate data which are processed, transferred, saved and shared with other parties. In this paper, we explore how the “data-hungry” nature of this industry and the lack of proper safeguarding mechanisms, standards, and regulations for vulnerable data can lead to complex harms or faint agentic potential. We adopted mixed methods in exploring users’ understanding of the security and privacy (SP) of these technologies. Our findings show that while users can speculate about the range of harms and risks associated with these technologies, they are not equipped with the technological skills to protect themselves against such risks. We discuss a number of approaches, including participatory threat modelling and SP by design, in the context of this work and conclude that such approaches are critical to protect users in these sensitive systems…(More)”.

Can Mobility of Care Be Identified From Transit Fare Card Data? A Case Study In Washington D.C.

Paper by Daniela Shuman, et al: “Studies in the literature have found significant differences in travel behavior by gender on public transit that are largely attributable to household and care responsibilities falling disproportionately on women. While the majority of studies have relied on survey and qualitative data to assess “mobility of care”, we propose a novel data-driven workflow utilizing transit fare card transactions, name-based gender inference, and geospatial analysis to identify mobility-of-care trip making. We find that the share of women travelers trip-chaining in the direct vicinity of mobility-of-care places of interest is 10–15% higher than that of men…(More)”.
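The three-step workflow the paper describes (fare card transactions → name-based gender inference → geospatial matching against care-related places) could be sketched roughly as follows. The toy name lookup, the sample coordinates, and the 500 m proximity threshold are all illustrative assumptions, not the authors' actual data or implementation.

```python
from math import radians, sin, cos, asin, sqrt

# Toy name->gender lookup standing in for a name-based inference service.
NAME_GENDER = {"maria": "F", "james": "M", "ana": "F", "david": "M"}

# Hypothetical "mobility of care" points of interest (lat, lon):
# e.g. schools, clinics, grocery stores near transit stops.
CARE_POIS = [(38.905, -77.033), (38.899, -77.021)]

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def is_care_trip_chain(tap_locations, threshold_km=0.5):
    """A trip chain counts as 'mobility of care' if any intermediate
    fare-card tap falls within threshold_km of a care point of interest."""
    return any(
        haversine_km(tap, poi) <= threshold_km
        for tap in tap_locations[1:-1]  # intermediate stops only
        for poi in CARE_POIS
    )

def care_share_by_gender(trips):
    """trips: list of (cardholder_first_name, [tap (lat, lon), ...]).
    Returns the share of trip chains classified as care-related, per gender."""
    counts = {"F": [0, 0], "M": [0, 0]}  # gender -> [care chains, total]
    for name, taps in trips:
        g = NAME_GENDER.get(name.lower())
        if g is None:
            continue  # name not inferable; such records are typically excluded
        counts[g][1] += 1
        counts[g][0] += is_care_trip_chain(taps)
    return {g: care / total if total else 0.0 for g, (care, total) in counts.items()}
```

Comparing the two shares then gives a rough analogue of the paper's 10–15% gap, subject to the usual caveats of name-based gender inference.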

Big data proves mobility is not gender-neutral

Blog by Ellin Ivarsson, Aiga Stokenberg and Juan Ignacio Fulponi: “All over the world, there is growing evidence showing that women and men travel differently. While there are many reasons behind this, one key factor is the persistence of traditional gender norms and roles that translate into different household responsibilities, different work schedules, and, ultimately, different mobility needs. Greater overall risk aversion and sensitivity to safety issues also play an important role in how women get around. Yet gender often remains an afterthought in the transport sector, meaning most policies or infrastructure investment plans are not designed to take into account the specific mobility needs of women.

The good news is that big data can help change that. In a recent study, the World Bank Transport team combined several data sources to analyze how women travel around the Buenos Aires Metropolitan Area (AMBA), including mobile phone signal data, congestion data from Waze, public transport smart card data, and data from a survey implemented by the team in early 2022 with over 20,300 car and motorcycle users.

Our research revealed that, on average, women in AMBA travel less often than men, travel shorter distances, and tend to engage in more complex trips with multiple stops and purposes. On average, 65 percent of the trips made by women are shorter than 5 kilometers, compared to 60 percent among men. Also, women’s hourly travel patterns are different, with 10 percent more trips than men during the mid-day off-peak hour, mostly originating in central AMBA. This reflects the larger burden of household responsibilities faced by women – such as picking children up from school – and the fact that women tend to work more irregular hours…(More)” See also Gender gaps in urban mobility.
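A back-of-the-envelope version of the distance comparison above could look like the following; the trip records, field layout, and 5 km cutoff are synthetic illustrations, not the World Bank team's data or code.

```python
def share_short_trips(trips, cutoff_km=5.0):
    """trips: list of (gender, distance_km, departure_hour) records.
    Returns the share of trips shorter than cutoff_km, per gender."""
    totals, short = {}, {}
    for gender, dist, _hour in trips:
        totals[gender] = totals.get(gender, 0) + 1
        if dist < cutoff_km:
            short[gender] = short.get(gender, 0) + 1
    return {g: short.get(g, 0) / n for g, n in totals.items()}
```

Run over real smart-card or mobile-signal records, a gap like the reported 65% vs. 60% would show up directly in the returned shares; the same record layout could feed an hourly histogram to surface the mid-day off-peak difference.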

How Data Happened: A History from the Age of Reason to the Age of Algorithms

Book by Chris Wiggins and Matthew L Jones: “From facial recognition—capable of checking people into flights or identifying undocumented residents—to automated decision systems that inform who gets loans and who receives bail, each of us moves through a world determined by data-empowered algorithms. But these technologies didn’t just appear: they are part of a history that goes back centuries, from the census enshrined in the US Constitution to the birth of eugenics in Victorian Britain to the development of Google search.

Expanding on the popular course they created at Columbia University, Chris Wiggins and Matthew L. Jones illuminate the ways in which data has long been used as a tool and a weapon in arguing for what is true, as well as a means of rearranging or defending power. They explore how data was created and curated, as well as how new mathematical and computational techniques developed to contend with that data serve to shape people, ideas, society, military operations, and economies. Although technology and mathematics are at its heart, the story of data ultimately concerns an unstable game among states, corporations, and people. How were new technical and scientific capabilities developed; who supported, advanced, or funded these capabilities or transitions; and how did they change who could do what, from what, and to whom?

Wiggins and Jones focus on these questions as they trace data’s historical arc, and look to the future. By understanding the trajectory of data—where it has been and where it might yet go—Wiggins and Jones argue that we can understand how to bend it to ends that we collectively choose, with intentionality and purpose…(More)”.

‘There is no standard’: investigation finds AI algorithms objectify women’s bodies

Article by Hilke Schellmann: “Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.

These AI tools, developed by large technology companies, including Google and Microsoft, are meant to protect users by identifying violent or pornographic visuals so that social media companies can block them before anyone sees them. The companies claim that their AI tools can also detect “raciness”, or how sexually suggestive an image is. With this classification, platforms – including Instagram and LinkedIn – may suppress contentious imagery.

Two Guardian journalists used the AI tools to analyze hundreds of photos of men and women in underwear, working out, or undergoing medical tests with partial nudity, and found evidence that the tools tag photos of women in everyday situations as sexually suggestive. The tools also rate pictures of women as more “racy” or sexually suggestive than comparable pictures of men. As a result, the social media companies that leverage these or similar algorithms have suppressed the reach of countless images featuring women’s bodies and hurt female-led businesses – further amplifying societal disparities.

Even medical pictures are affected by the issue. The AI algorithms were tested on images released by the US National Cancer Institute demonstrating how to do a clinical breast examination. Google’s AI gave this photo the highest score for raciness, Microsoft’s AI was 82% confident that the image was “explicitly sexual in nature”, and Amazon classified it as representing “explicit nudity”…(More)”.
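The investigation's basic method — scoring paired, comparable photos of men and women and comparing the results — can be outlined as below. The `score_raciness` stub and its canned values stand in for a real cloud vision API (Google, Microsoft, and Amazon each expose different moderation endpoints and score scales); everything here is an illustrative assumption, not the Guardian's actual code or scores.

```python
def score_raciness(image_id):
    """Stub returning a suggestiveness score in [0, 1].
    A real study would call the provider's image-moderation API here;
    these canned values are invented for illustration."""
    canned = {"woman_workout": 0.72, "man_workout": 0.31,
              "woman_underwear": 0.89, "man_underwear": 0.48}
    return canned[image_id]

def paired_bias(pairs):
    """pairs: list of (woman_image_id, man_image_id) depicting comparable
    scenes. Returns the mean score gap (women minus men); a positive value
    means women's photos are rated as more suggestive on average."""
    gaps = [score_raciness(w) - score_raciness(m) for w, m in pairs]
    return sum(gaps) / len(gaps)
```

Pairing comparable scenes is what isolates the gender signal: any systematic positive gap across many pairs points at the classifier rather than at differences in image content.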

A ‘Feminist’ Server to Help People Own Their Own Data

Article by Padmini Ray Murray: “All of our digital lives reside on servers – mostly in corporate server farms owned by the likes of Google, Amazon, Apple, and Microsoft.  These farms contain machines that store massive volumes of data generated by every single user of the internet. These vast infrastructures allow people to store, connect, and exchange information on the internet. 

Consequently, there is a vast distance between users and where and how their data is stored, which means individuals have very little control over how that data is used. Yet, given the heavy reliance on these corporate technologies, individuals are left with little choice but to accept the terms dictated by these businesses. The conceptual alternative of the feminist server was created by groups of feminist and queer activists who were concerned about how little power they had to own and manage their data on the internet. The idea of the feminist server was described as a project interested in “creating a more autonomous infrastructure to ensure that data, projects and memory of feminist groups are properly accessible, preserved and managed” – a safe digital library to store and manage content generated by feminist groups. This was also a direct challenge to the traditionally male-dominated spaces of computer hardware management, spaces which could be very exclusionary and hostile to women or queer individuals who might be interested in learning how to use these technologies. 

There are two related ways by which a server can be considered as feminist. The first is based on who runs the server, and the second is based on who owns the server. Feminist critics have pointed out how the running of servers is often in the hands of male experts who are not keen to share and explain the knowledge required to maintain a server – a role known as a systems admin or, colloquially, a “sysadmin” person. Thus the concept of feminist servers emerged out of a need to challenge patriarchal dominance in hardware and infrastructure spaces, to create alternatives that were nurturing, anti-capitalist, and worked on the basis of community and solidarity…(More)”.