“Alexa, why are women 47 percent more likely than men to get seriously injured in a car crash?”
If you are a woman, there is a distinct possibility that Alexa (the Amazon assistant) and Siri (the Apple assistant) are less likely to process your voice accurately. In fact, Google's speech-recognition software is 70% more likely to recognize male speech accurately than female speech.
Translation software carries an inherent gender bias as well. Type doctor into Bing Translate and the Spanish translation you get is médico, the masculine form. To get the feminine médica, you have to specify female doctor; for a male doctor, no such specification is needed. We find another example of gender bias in virtual reality, where headsets are too big for the average woman's head and women are more likely than men to experience motion sickness. The problem is not women. It is the gender data gap.
The consequences of the examples above may seem minor, but what if they were life-threatening?
In that same car crash, women are 17 percent more likely than men to die, and 47 percent more likely to sustain serious, life-changing injuries. Why? Because vehicle safety systems such as airbags and seatbelts are designed and tested on crash test dummies modeled on the male body. In 2015, the EU finally decided to adjust its testing to account for women and introduced a female crash test dummy. However, this dummy is used in only one of the five regulatory tests, and only in the passenger seat.
It’s not you, it’s the gender data gap
In her book Invisible Women, celebrated feminist advocate Caroline Criado Perez presents impressive research on the gender data gap. The gap stems from the fact that data about the typical male body and male life patterns are treated as universal, whereas those of women, who account for half of the world's population, are treated as niche. The consequences range from Google Maps giving women the fastest route to a location instead of the safest one, to pedometers that do not count steps while women push strollers because their arms do not swing, to phones that are too large for women's hands and smartwatches that are too big for their wrists.
On top of this, manufacturers such as Samsung and Apple often seem to feel they have made their smartwatches suitable for women by turning them into 'a piece of jewelry' or coloring them pink. But designing new technologies as a piece of jewelry or turning them pink does not compensate for the flaws in their design. It only addresses superficial aesthetics, glossing over the fact that these gadgets are too heavy for a woman's wrist and do not offer any apps that actually make women's lives more efficient.
The lack of sex-disaggregated data, data that are collected and analyzed separately for males and females, has also resulted in an incomplete picture of women's needs in the Fourth Industrial Revolution. It is vital to note, however, that the gender data gap is not generally malicious. Humans tend to test projects on themselves first, and since the tech industry is male-dominated, so are its prototypes. The core problem lies in a way of thinking that has been around for centuries: it is the simple failure to think about women, in all its many forms, that has led to the gender data gap. Because when we say humans, 9 times out of 10, we mean men.
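To make the idea of sex-disaggregated data concrete, here is a minimal sketch in Python with entirely made-up numbers: an aggregate average over a mixed sample can look like a sensible design target while fitting almost none of the women in that sample. The wrist measurements and strap size below are hypothetical, chosen only to illustrate why disaggregating by sex matters.

```python
# Hypothetical wrist-circumference data (cm); all numbers are invented
# for illustration, not taken from any real study.
men = [17.5, 18.0, 18.5, 19.0, 18.0]
women = [14.5, 15.0, 15.5, 15.0, 14.0]

def mean(xs):
    """Plain arithmetic mean."""
    return sum(xs) / len(xs)

# The aggregate mean: the "average user" a designer might target.
combined = mean(men + women)
print(round(combined, 2))  # 16.5

# Disaggregated, the picture changes: a strap sized around the combined
# mean is too large for every woman in this toy sample.
strap_min = 16.0  # hypothetical smallest strap setting
too_big_for = sum(x < strap_min for x in women)
print(too_big_for)  # 5, i.e. all five women
```

The aggregate statistic is not wrong, it is just uninformative: only by splitting the data by sex does the design failure become visible.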
Understanding how machine learning works is also essential to grasping the dynamics of the gender data gap. Data is just another word for information, and if we feed these algorithms data that do not include women, we create a recipe for designs that are biased towards men, intentionally or not. When a deep-learning algorithm is trained on biased data, it only gets better and better at being biased, with all the intended or unintended consequences.
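A toy sketch makes this mechanism visible. Below is a naive "hiring" model trained by simple counting on invented historical data in which past decisions disadvantaged candidates with a feature correlated with women; the feature name, the numbers, and the scenario are all hypothetical. The model does nothing wrong algorithmically, it simply learns the bias baked into its training data.

```python
# Hypothetical example: a counting-based model reproduces historical bias.
from collections import defaultdict

# Invented historical records: (CV feature, was the candidate hired?).
# In this made-up history, candidates with the feature were rarely hired.
history = (
    [("no_club", True)] * 90 + [("no_club", False)] * 10 +
    [("club", True)] * 10 + [("club", False)] * 40
)

def train(records):
    """Estimate P(hired | feature) by simple counting."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for feature, was_hired in records:
        total[feature] += 1
        hired[feature] += was_hired
    return {f: hired[f] / total[f] for f in total}

model = train(history)
# The model penalizes the feature correlated with women, even though
# the feature says nothing about ability.
print(model["no_club"])  # 0.9
print(model["club"])     # 0.2
```

Scored on new candidates, this model recommends against anyone with the feature: garbage (biased data) in, garbage (biased decisions) out, now with the appearance of objectivity.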
This raises the question: how can we decode a world that is designed for men?
Breaking through ‘the virtual glass ceiling’
Decoding the gender data gap is too important to be left to men alone. Diversity in big tech companies should be taken as seriously as the code of the algorithms they write and guard so carefully.
If only half of the world's population is represented at the top of these big tech companies, the gender data gap will not go away. Women therefore need to enter the so-called boys' club of Silicon Valley. Female representation in tech companies is vital for the simple reason that women do not forget about women. Women are also more aware of female-specific needs, and that awareness influences the kinds of research tech companies conduct. Female representation is about more than women getting the job; it serves a higher aim: closing the gender data gap. Because it is difficult for women to convince others of a need if those others do not have that need themselves.
Besides, several studies show that the more diverse a company's leadership is, the more innovative the company becomes, and the presence of diverse perspectives keeps a company better informed about all of its customers. It follows that women are leading the way in decoding the gender data gap.
But it doesn't end there. We also need to push for legislative change. The more laws and regulations are introduced to remind the tech industry, for example, to collect sex-disaggregated data (and at the very beginning of every process, mind you), the less likely it is to forget about women.
And now that you see us…
With the rapid spread of artificial intelligence and big data, the gender data gap demands more urgent attention than ever before. Current technology designs disadvantage women and keep them from doing their work effectively; they can even cost women their jobs. Think, for example, of Amazon's biased hiring algorithm. These designs also affect women's comfort, health, and safety. Technology can provide speed, growth, and efficiency, but it cannot provide women with a world in which they also belong. And this is not a matter of abstract morality: in the real world, women are not an exception or a niche demographic, they are half of the audience.
Gender has an impact on the questions we ask, and ultimately it comes down to who is making the decisions. When women are involved in decision-making, research, and technology, women will not be forgotten. Diversity is not just a box to be checked off; it is the key to designing a world that also works for women. The answer is clear: all the tech industry had to do all along was ask women. Easy.
Gabriella Obispa is a guest writer for Profound. She is a master’s student majoring in International Technology and Law at the Vrije Universiteit Amsterdam. As a feminist and ‘woman in tech’, she is committed to the empowerment of women and diversity within the tech scene.