
The Data Gender Divide

January 2021

Marketing Transformation

Hayley Burchall, Senior Account Manager


AI is learning to be sexist – and it’s our fault.  

Data has become a vital part of our world; we use it every day, especially in marketing. But are you checking that your data isn’t misogynistic?

When we feed AI data that is inadvertently biased, it learns to be prejudiced. Bias in data comes from gaps and assumptions: from the under-representation of women in the data, and from assuming that women follow the same patterns as men, which simply reflects the bias that already exists in society. And according to a Harnham diversity report, only 18% of data scientists in America are women, while 11% of data teams have no women at all! No wonder the data gender divide has become an issue.

Lack of female inclusion in research data

This is not a new concept. Simone de Beauvoir famously wrote, “man defines woman not in herself, but relative to him. He is the Absolute – she is the Other.” When I say this has become an issue, I mean that bias in big data can result in corrupted half-data, full of gaps. And it is a big issue – meaning life and death in some instances. Take healthcare as an example: only in 1993 (just 28 years ago) were women mandated to be included in medical trials, so most earlier medical research either didn’t account for women at all or simply reduced the doses for them when drugs went to market. This issue is seen in Parkinson’s disease, ADHD and even in extremely time-critical heart attack diagnoses for women.

Gender-biased language

When you put rubbish data in, you get rubbish results out. A particularly interesting area where you can see bias manifesting is Natural Language Processing. Google Translate previously presented gender-biased translations of certain words and took steps to fix it: it now presents two alternatives for some translations, e.g. ‘she is a doctor’ and ‘he is a doctor’. Timnit Gebru, technical co-lead of the Ethical Artificial Intelligence team at Google, has said she spends much of her time mitigating the harm that can result from the technology being misused – as if it were an atomic bomb.
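You don’t have to take this on trust – translation bias is easy to probe yourself. Below is a minimal Python sketch, assuming the Hugging Face transformers library and the open-source Helsinki-NLP opus-mt-tr-en model (Turkish pronouns are genderless, so the model has to choose ‘he’ or ‘she’ on its own); the occupation list is purely illustrative.

```python
# Minimal translation-bias probe: translate gender-neutral Turkish sentences
# ("o bir doktor" = "he/she is a doctor") and see which pronoun the model picks.
# Assumes: pip install transformers sentencepiece (model downloads on first run).
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tr-en")

# Illustrative occupations (Turkish -> English); not an exhaustive or validated list.
occupations = {"doktor": "doctor", "hemşire": "nurse",
               "mühendis": "engineer", "sekreter": "secretary"}

for tr_word, en_word in occupations.items():
    source = f"o bir {tr_word}"  # gender-neutral: "he/she is a <occupation>"
    result = translator(source)[0]["translation_text"]
    text = result.lower()
    pronoun = "she" if text.startswith("she") else "he" if text.startswith("he") else "other"
    print(f"{en_word:>10}: {result!r} -> default pronoun: {pronoun}")
```

If the defaults split along stereotype lines – ‘he’ for engineer, ‘she’ for nurse – that’s the training data’s bias showing through.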

There is also the huge issue of the relationships between words that AI learns through word embeddings. When an algorithm connects the word ‘woman’ directly to stereotypes such as ‘homemaker’ or ‘receptionist’, it shows why algorithms need to be transparent and tested for bias. If they aren’t, neither we nor the machines are providing useful insights that help us grow and make the world better.
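To make the embedding point concrete, here is a hedged sketch of how these learnt associations can be tested, assuming the gensim library and its pre-trained word2vec-google-news-300 vectors (the job words are illustrative; exact neighbours vary by model). Probes like this echo the well-known 2016 study ‘Man is to Computer Programmer as Woman is to Homemaker?’, the source of examples like those above.

```python
# Probing a pre-trained word embedding for gender-stereotyped associations.
# Assumes: pip install gensim (the vectors, ~1.6 GB, download on first use).
import gensim.downloader as api

model = api.load("word2vec-google-news-300")  # pre-trained word2vec vectors

# Classic analogy probe: "man is to computer_programmer as woman is to ...?"
print(model.most_similar(positive=["woman", "computer_programmer"],
                         negative=["man"], topn=3))

# Simpler check: is a job title closer to 'she' or to 'he' in vector space?
for job in ["homemaker", "receptionist", "engineer", "doctor"]:  # illustrative list
    gap = model.similarity("she", job) - model.similarity("he", job)
    lean = "she" if gap > 0 else "he"
    print(f"{job:>15}: leans toward '{lean}' (similarity gap = {gap:+.3f})")
```

A transparent, repeatable check like this is the first step towards the bias testing that algorithms need.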

Female insights  

You’ve got to wonder: if more women were present in these data roles, could many of these issues have been avoided? Technology will always reflect the values of its creators. So from the participation, to the analysis, to the algorithms, right up to the board members, inclusivity and representation of women should be essential. Otherwise, all we are doing is mirroring the old, familiar stereotypes of the past. Futurists are optimistic that the most valued skills in an AI workforce will be in the areas where AI falls short: social and emotional intelligence (EI) – skills associated with positions that are predominantly held by women.

Women make up roughly half of the global population, so ensuring we consider women in data decisions safeguards AI’s usefulness. (Just try not to use AI in your recruiting process, because it’s more likely to advertise those higher-paid jobs to men.) Designing a world that works for everyone needs everyone represented in the room when designing.

Who knows – perhaps airbags would have been designed to factor in your boobs, and I might be able to hold an iPhone with one hand?

Want to read more about the Digital Divide?

Try: The role of technology post-COVID-19
