How does a computer ‘see’ gender?


Pew Research Center: “Machine vision tools like facial recognition are increasingly being used for law enforcement, advertising, and other purposes. Pew Research Center itself recently used a machine vision system to measure the prevalence of men and women in online image search results. This kind of system develops its own rules for identifying men and women after seeing thousands of example images, but these rules can be hard for humans to discern. To better understand how this works, we showed images of the Center’s staff members to a trained machine vision system similar to the one we used to classify image searches. We then systematically obscured sections of each image to see which parts of the face caused the system to change its decision about the gender of the person pictured. Some of the results seemed intuitive, others baffling. In this interactive challenge, see if you can guess what makes the system change its decision.

Here’s how it works:…(More)”.
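
The occlusion technique described above is straightforward to sketch in code. Below is a minimal illustration, assuming a hypothetical `predict_gender(image)` classifier and a NumPy image array; it is not Pew's actual system.

```python
# Minimal occlusion-sensitivity sketch. `predict_gender` is a
# hypothetical stand-in for a trained classifier, not Pew's model.
import numpy as np

def occlusion_flips(image, predict_gender, patch=20, stride=10):
    """Slide a gray patch across the image and record the regions
    where hiding the pixels flips the classifier's decision."""
    baseline = predict_gender(image)
    h, w = image.shape[:2]
    flips = []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = 128  # gray block
            if predict_gender(occluded) != baseline:
                flips.append((y, x))  # this region changed the call
    return baseline, flips
```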

Gender Gaps in Urban Mobility


Brief from the Data2X Big Data and Gender Brief Series by The GovLab, UNICEF, Universidad del Desarrollo, Telefónica R&D Center, ISI Foundation, and DigitalGlobe: “Mobility is gendered. For example, the household division of labor in many societies leads women and girls to take more multi-purpose, multi-stop trips than men. Women-headed households also tend to work more in the informal sector, with limited access to transportation subsidies, and their use of public transit is further reduced by the risk of violence in public spaces.

This brief summarizes a recent analysis of gendered urban mobility in 51 (out of 52) neighborhoods of Santiago, Chile, relying on the call detail records (CDRs) of a large sample of mobile phone users over a period of three months. We found that women: 1) move less overall than men; 2) have a smaller radius of movement; and 3) tend to concentrate their time in a smaller set of locations. These mobility gaps are linked to lower average incomes and fewer public and private transportation options. These insights, taken from large volumes of passively generated, inexpensive data streaming in real time, can help policymakers design more gender-inclusive urban transit systems….(More)”.
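
The "radius of movement" in finding (2) is commonly measured as a radius of gyration over each user's visited locations. Here is a rough sketch of that calculation under a flat-earth approximation; the function and inputs are assumptions for illustration, not the study's actual pipeline.

```python
# Radius of gyration from a list of visited (lat, lon) points, in km.
# Illustrative only; real CDR pipelines weight visits by frequency.
import math

def radius_of_gyration(points):
    """points: list of (lat, lon) in degrees. Flat-earth approximation
    around the centroid, adequate at city scale."""
    lat_c = sum(p[0] for p in points) / len(points)
    lon_c = sum(p[1] for p in points) / len(points)
    km_per_deg = 111.32  # approx. km per degree of latitude
    sq_dists = []
    for lat, lon in points:
        dy = (lat - lat_c) * km_per_deg
        dx = (lon - lon_c) * km_per_deg * math.cos(math.radians(lat_c))
        sq_dists.append(dx * dx + dy * dy)
    return math.sqrt(sum(sq_dists) / len(sq_dists))

# e.g. radius_of_gyration([(-33.45, -70.66), (-33.50, -70.61), (-33.44, -70.65)])
```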

Raw data won’t solve our problems — asking the right questions will


Stefaan G. Verhulst in apolitical: “If I had only one hour to save the world, I would spend fifty-five minutes defining the questions, and only five minutes finding the answers,” is a famous aphorism attributed to Albert Einstein.

Behind this quote is an important insight about human nature: Too often, we leap to answers without first pausing to examine our questions. We tout solutions without considering whether we are addressing real or relevant challenges or priorities. We advocate fixes for problems, or for aspects of society, that may not be broken at all.

This misordering of priorities is especially acute — and represents a missed opportunity — in our era of big data. Today’s data has enormous potential to solve important public challenges.

However, policymakers often fail to invest in defining the questions that matter, focusing mainly on the supply side of the data equation (“What data do we have or must have access to?”) rather than the demand side (“What is the core question and what data do we really need to answer it?” or “What data can or should we actually use to solve those problems that matter?”).

As such, data initiatives often provide marginal insights while at the same time generating unnecessary privacy risks by accessing and exploring data that may not in fact be needed at all in order to address the root of our most important societal problems.

A new science of questions

So what are the truly vexing questions that deserve attention and investment today? Toward what end should we strategically seek to leverage data and AI?

The truth is that policymakers and other stakeholders currently don’t have a good way of defining questions or identifying priorities, nor a clear framework to help us leverage the potential of data and data science toward the public good.

This is a situation we seek to remedy at The GovLab, an action research center based at New York University.

Our most recent project, the 100 Questions Initiative, seeks to begin developing a new science and practice of questions — one that identifies the most urgent questions in a participatory manner. Launched last month, the project aims to develop a process that takes advantage of distributed and diverse expertise on a range of given topics or domains so as to identify and prioritize those questions that are high-impact, novel, and feasible.

Because we live in an age of data and much of our work focuses on the promises and perils of data, we seek to identify the 100 most pressing problems confronting the world that could be addressed by greater use of existing, often inaccessible, datasets through data collaboratives – new forms of cross-disciplinary collaboration beyond public-private partnerships focused on leveraging data for good….(More)”.

Sharing Private Data for Public Good


Stefaan G. Verhulst at Project Syndicate: “After Hurricane Katrina struck New Orleans in 2005, the direct-mail marketing company Valassis shared its database with emergency agencies and volunteers to help improve aid delivery. In Santiago, Chile, analysts from Universidad del Desarrollo, ISI Foundation, UNICEF, and the GovLab collaborated with Telefónica, the city’s largest mobile operator, to study gender-based mobility patterns in order to design a more equitable transportation policy. And as part of the Yale University Open Data Access project, health-care companies Johnson & Johnson, Medtronic, and SI-BONE give researchers access to previously walled-off data from 333 clinical trials, opening the door to possible new innovations in medicine.

These are just three examples of “data collaboratives,” an emerging form of partnership in which participants exchange data for the public good. Such tie-ups typically involve public bodies using data from corporations and other private-sector entities to benefit society. But data collaboratives can help companies, too – pharmaceutical firms share data on biomarkers to accelerate their own drug-research efforts, for example. Data-sharing initiatives also have huge potential to improve artificial intelligence (AI). But they must be designed responsibly and take data-privacy concerns into account.

Understanding the societal and business case for data collaboratives, as well as the forms they can take, is critical to gaining a deeper appreciation of the potential and limitations of such ventures. The GovLab has identified over 150 data collaboratives spanning continents and sectors; they include companies such as Air France, Zillow, and Facebook. Our research suggests that such partnerships can create value in three main ways….(More)”.

Journalism Initiative Crowdsources Feedback on Failed Foreign Aid Projects


Abigail Higgins at SSIR: “It isn’t unusual that a girl raped in northeastern Kenya would be ignored by law enforcement. But for Mary, whose name has been changed to protect her identity, it should have been different—NGOs had established a hotline to report sexual violence just a few years earlier to help girls like her get justice. Even though the hotline was backed by major aid institutions like Mercy Corps and the British government, calls to it regularly went unanswered.

“That was the story that really affected me. It touched me in terms of how aid failures could impact someone,” says Anthony Langat, a Nairobi-based reporter who investigated the hotline as part of a citizen journalism initiative called What Went Wrong? that examines failed foreign aid projects.

Over six months in 2018, What Went Wrong? collected 142 reports of failed aid projects in Kenya, each submitted over the phone or via social media by the very people the project was supposed to benefit. It’s a move intended to help upend the way foreign aid is disbursed and debated. Although aid organizations spend significant time evaluating whether or not aid works, beneficiaries are often excluded from that process.

“There’s a serious power imbalance,” says Peter DiCampo, the photojournalist behind the initiative. “The people receiving foreign aid generally do not have much say. They don’t get to choose which intervention they want, which one would feel most beneficial for them. Our goal is to help these conversations happen … to put power into the hands of the people receiving foreign aid.”

What Went Wrong? documented eight failed projects in an investigative series published by Devex in March. In Kibera, one of Kenya’s largest slums, public restrooms meant to improve sanitation failed to connect to water and sewage infrastructure and were later repurposed as churches. In another story, the World Bank and local thugs struggled for control over the slum’s electrical grid….(More)”

Invisible Women: Exposing Data Bias in a World Designed for Men


Book by Caroline Criado Perez: “Imagine a world where your phone is too big for your hand, where your doctor prescribes a drug that is wrong for your body, where in a car accident you are 47% more likely to be seriously injured, where every week the countless hours of work you do are not recognised or valued. If any of this sounds familiar, chances are that you’re a woman.

Invisible Women shows us how, in a world largely built for and by men, we are systematically ignoring half the population. It exposes the gender data gap – a gap in our knowledge that is at the root of perpetual, systemic discrimination against women, and that has created a pervasive but invisible bias with a profound effect on women’s lives.

Award-winning campaigner and writer Caroline Criado Perez brings together for the first time an impressive range of case studies, stories and new research from across the world that illustrate the hidden ways in which women are forgotten, and the impact this has on their health and well-being. From government policy and medical research, to technology, workplaces, urban planning and the media, Invisible Women reveals the biased data that excludes women. In making the case for change, this powerful and provocative book will make you see the world anew….(More)”

Using Artificial Intelligence to Promote Diversity


Paul R. Daugherty, H. James Wilson, and Rumman Chowdhury at MIT Sloan Management Review:  “Artificial intelligence has had some justifiably bad press recently. Some of the worst stories have been about systems that exhibit racial or gender bias in facial recognition applications or in evaluating people for jobs, loans, or other considerations. One program was routinely recommending longer prison sentences for blacks than for whites on the basis of the flawed use of recidivism data.

But what if instead of perpetuating harmful biases, AI helped us overcome them and make fairer decisions? That could eventually result in a more diverse and inclusive world. What if, for instance, intelligent machines could help organizations recognize all worthy job candidates by avoiding the usual hidden prejudices that derail applicants who don’t look or sound like those in power or who don’t have the “right” institutions listed on their résumés? What if software programs were able to account for the inequities that have limited the access of minorities to mortgages and other loans? In other words, what if our systems were taught to ignore data about race, gender, sexual orientation, and other characteristics that aren’t relevant to the decisions at hand?
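
The simplest reading of "taught to ignore" is dropping protected attributes before training, sketched below with hypothetical column names. Note that this naive baseline is known to be insufficient on its own: correlated proxies such as postal codes can reintroduce the very bias being removed.

```python
# Naive "fairness through unawareness" sketch: train on a table with
# the protected columns removed. Column names are hypothetical, and
# the remaining features are assumed to be numeric.
import pandas as pd
from sklearn.linear_model import LogisticRegression

PROTECTED = ["race", "gender", "sexual_orientation"]

def train_without_protected(df: pd.DataFrame, label: str):
    features = df.drop(columns=PROTECTED + [label], errors="ignore")
    model = LogisticRegression(max_iter=1000)
    model.fit(features, df[label])
    return model
```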

AI can do all of this — with guidance from the human experts who create, train, and refine its systems. Specifically, the people working with the technology must do a much better job of building inclusion and diversity into AI design by using the right data to train AI systems to be inclusive and thinking about gender roles and diversity when developing bots and other applications that engage with the public.

Design for Inclusion

Software development remains the province of males — only about one-quarter of computer scientists in the United States are women — and minority racial groups, including blacks and Hispanics, are underrepresented in tech work, too. Groups like Girls Who Code and AI4ALL have been founded to help close those gaps. Girls Who Code has reached almost 90,000 girls from various backgrounds in all 50 states, and AI4ALL specifically targets girls in minority communities….(More)”.

Crowdsourced data informs women which streets are safe


Springwise: “Safe & the City is a free app designed to help users identify which streets are safe for them. Sexual harassment and violent crimes against women in particular are a big problem in many urban environments. This app uses crowdsourced data and crime statistics to help female pedestrians stay safe.

It is a development of traditional navigation apps, but instead of simply providing the fastest route, it also offers information on the safest one. The Live Map relies on user data: victims can report harassment or assault in the app, and the information then becomes available to other users to warn them of a potential threat in the area. Incidents can be ranked from a feeling of discomfort or threat, to verbal harassment, to physical assault. While navigating, the Live Map can also alert users to potentially dangerous intersections ahead, reminding people to stay alert rather than focusing only on their phones while walking.
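
To make the mechanics concrete, here is a hedged sketch of how a live map like this might rank incident reports by severity and surface nearby alerts; all names and thresholds are assumptions, not Safe & the City's actual implementation.

```python
# Severity-ranked proximity alerts over crowdsourced reports.
# Illustrative only; the categories mirror those described above.
import math
from dataclasses import dataclass

SEVERITY = {"discomfort": 1, "verbal_harassment": 2, "physical_assault": 3}

@dataclass
class Report:
    lat: float
    lon: float
    kind: str  # one of SEVERITY's keys

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_alerts(user_lat, user_lon, reports, radius_m=150):
    """Reports within radius_m of the user, most severe first."""
    hits = [rep for rep in reports
            if distance_m(user_lat, user_lon, rep.lat, rep.lon) <= radius_m]
    return sorted(hits, key=lambda rep: SEVERITY[rep.kind], reverse=True)
```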

The Safe Sites feature is also a way of incorporating the community. Businesses and organisations can register to be Safe Sites. They will then receive training from SafeSeekers in how to provide the best support and assistance in emergency situations. The locations of such Sites will be available on the app, should a user need one.

The iOS app launched in March 2018 on International Women’s Day. It is currently only available for London…(More)”

Tricky Design: The Ethics of Things


Book edited by Tom Fisher and Lorraine Gamman: “Tricky Design responds to the burgeoning of scholarly interest in the cultural meanings of objects, by addressing the moral complexity of certain designed objects and systems.

The volume brings together leading international designers, scholars and critics to explore some of the ways in which the practice of design and its outcomes can have a dark side, even when the intention is to design for the public good. Considering a range of designed objects and relationships, including guns, eyewear, assisted suicide kits, anti-rape devices, passports and prisons, the contributors offer a view of design as both progressive and problematic, able to propose new material and human relationships, yet also constrained by social norms and ideology. 

This contradictory, tricky quality of design is explored in the editors’ introduction, which positions the objects, systems, services and ‘things’ discussed in the book in relation to the idea of the trickster that occurs in anthropological literature, as well as in classical thought, discussing design interventions that have positive and negative ethical consequences. These will include objects, both material and ‘immaterial’, systems with both local and global scope, and also different processes of designing. 

This important new volume brings a fresh perspective to the complex nature of ‘things’, and makes a truly original contribution to debates in design ethics, design philosophy and material culture….(More)”

Crowd-mapping gender equality – a powerful tool for shaping a better city launches in Melbourne


Nicole Kalms at The Conversation: “Inequity in cities has a long history. The importance of social and community planning to meet the challenge of creating people-centred cities looms large. While planners, government and designers have long understood the problem, uncovering the many important marginalised stories is an enormous task.

Technology – so often bemoaned – has provided an unexpected and powerful primary tool for designers and makers of cities. Crowd-mapping asks the community to anonymously engage and map their experiences using their smartphones, via a web app. The new Gender Equality Map, launched today in two pilot locations in Melbourne, focuses on experiences of equality or inequality in the neighbourhood.

How does it work?

[Image: Participants can map their experience of equality or inequality in their neighbourhood using locator pins.]

Crowd-mapping generates geolocative data. This is made up of points “dropped” to a precise geographical location. The data can then be analysed and synthesised for insights, tendencies and “hotspots”.
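
As a rough illustration, one simple way to turn dropped pins into hotspots is to snap each point to a fixed grid and count pins per cell; the cell size and function below are assumptions, not the project's actual analysis.

```python
# Grid-binning sketch for hotspot detection from (lat, lon) pins.
from collections import Counter

def hotspots(pins, cell_deg=0.005, top_n=10):
    """pins: iterable of (lat, lon) tuples. Returns the top_n grid
    cells by pin count as ((cell_lat, cell_lon), count) pairs."""
    counts = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in pins
    )
    return counts.most_common(top_n)
```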

The diversity of its applications shows the adaptability of the method. The digital, community-based method of crowd-mapping has been used across the globe. Under-represented citizens have embraced the opportunity to tell their stories as a way to engage with and change their experience of cities….(More)”