The Cost of Exclusion
Why AI's Gender Gap Affects Everyone
The deadliest design flaw in automotive history wasn't mechanical; it was masculine. Crash test dummies have been in use since 1949, yet the first dummy modeled on a woman's body wasn't used until 2022. That's more than 70 years during which safety features were designed for only half of the population. What did this mean for women? Despite being less likely to be involved in car crashes, they are 47% more likely to be seriously injured (and 17% more likely to die) than men.
Women in police forces and militaries are also more at risk than their male counterparts because of gender-blind design. Protective gear is often ill-suited to their bodies: too bulky on the shoulders, digging into the hips, or not fitting over their breasts. The consequences of male-centric design cascade from daily discomfort to potentially fatal outcomes. Humans are masters at solving problems, but we tend to be blind to the struggles of others and too often try to solve problems from a single perspective.
Now, we're building one of the most transformative technologies in human history, artificial intelligence, and we're making the same mistakes all over again. We’re building tools that disproportionately harm women, in part by not including them in the conversation or addressing their needs. According to the World Economic Forum, only 22% of AI professionals globally are female. If this number seems familiar, it should. It's part of a pattern we've seen play out across the tech industry, where women make up just 20% of technical roles at major companies.
Our Past Is Programming Our Future
Technology is not created in a vacuum. It is molded by the many hands of our past and guided by how we envision our future. Even if, in this new future, we aim to create technology that is better than a collage of the worst parts of ourselves, our AIs are trained on decades of data that, at best, equate "man" with "human" and, at worst, perpetuate decades of misogyny.
Our data is flawed because human history is flawed. For example, we know far less about how the female body works than about the male body because, until 1993, women were largely excluded from medical studies over concerns about menstrual cycles. This created a knowledge gap, a knowledge cavern even, leading to adverse reactions and misdiagnoses. Even today, AI diagnostic systems often misinterpret women's symptoms because the underlying data is incomplete.
A more recent example is large language models being trained on text scraped from the internet. When we train AI models on internet data, we're not just teaching them language; we're imparting culture, attitudes, and biases. Reddit, a popular source of training data, has a user base that's two-thirds male. While there are plenty of wholesome corners of Reddit dedicated to science and cat memes, it's also notorious for being a breeding ground for manosphere edgelord culture.
The cultural tilt of the training data inevitably skews how AI interprets and processes information. The Berkeley Haas Center for Equity, Gender, and Leadership found that about 44% of AI systems showed gender bias, and 25% exhibited both gender and racial bias. We might not have consciously chosen to include harmful data, but artificial intelligence is a mirror; it reflects back the biases present in our society, sometimes with stunning, and often uncomfortable, clarity.
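To make the "mirror" point concrete, here is a deliberately tiny, hypothetical illustration, not the methodology of the Berkeley Haas study or of any production bias audit: a few made-up sentences stand in for scraped internet text, and a simple co-occurrence count shows how gendered associations with occupations emerge from the data itself. A language model trained on real web text absorbs the same kind of statistical signal, just at a vastly larger scale.

# Toy illustration (hypothetical corpus, not real training data): count how
# often occupation words co-occur with gendered pronouns. A model trained on
# skewed text internalizes exactly these lopsided associations.
from collections import Counter
from itertools import product

corpus = [
    "the engineer said he fixed the server",
    "the nurse said she would check on the patient",
    "he is a brilliant programmer",
    "she is a helpful assistant",
    "the doctor said he would call back",
]

GENDERED = {"he": "male", "his": "male", "she": "female", "her": "female"}
OCCUPATIONS = {"engineer", "nurse", "programmer", "assistant", "doctor"}

counts = Counter()
for sentence in corpus:
    tokens = sentence.split()
    genders = {GENDERED[t] for t in tokens if t in GENDERED}
    jobs = {t for t in tokens if t in OCCUPATIONS}
    # Credit every (occupation, gender) pair that appears in the same sentence.
    for job, gender in product(jobs, genders):
        counts[(job, gender)] += 1

for job in sorted(OCCUPATIONS):
    print(f"{job:>10}: male={counts[(job, 'male')]} female={counts[(job, 'female')]}")

Run on this toy corpus, "engineer", "programmer", and "doctor" line up with male pronouns while "nurse" and "assistant" line up with female ones; nobody coded that rule in, the data simply carried it.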
Breaking the Silicon Ceiling
The 'pipeline problem' has become tech's favorite alibi for gender inequality ('it's not our fault women aren't interested in this field'), but it masks deeper, more systemic issues. Women continue to outpace men in university education and are entering STEM fields in record numbers; they just aren't staying. The World Economic Forum reported in 2023 that women accounted for 29% of all STEM workers, but they're concentrated in entry-level positions and drop off at every rung of the corporate ladder after that.
There is a similar gender diversity crisis in AI research: just 13.8% of authors of AI papers are women, compared to 15.5% across STEM subjects, and the numbers have been falling over the last 10 years. Apart from the University of Washington, none of the top 35 institutions for AI research have more than 25% of their papers listed as authored by women.
Research compiled by Datatech Analytics for the Women in Data campaign found that only 25% of UK jobs in artificial intelligence and other specialist technology roles were filled by women in 2019, the lowest proportion in two decades. Numbers elsewhere are similar.
The tech workplace culture itself can be a barrier, especially in countries like the United States, where PTO and parental leave barely exist. Many in tech or start-up culture share a mentality of working ridiculously long hours, a schedule that's often impossible for women who still shoulder a significant majority of domestic and care responsibilities. Add to that the subtle (and sometimes not-so-subtle) biases: being talked over in meetings, having ideas attributed to male colleagues, or facing harassment. It's death by a thousand cuts, and many talented women eventually decide it's not worth the battle.
Disproportionate Harm
With women excluded from the development of AI, it is no surprise that they face significantly more harm from it. For example, research from the McKinsey Global Institute finds that the bulk of AI-induced job losses will affect women without college degrees, because those women disproportionately hold the entry-level jobs most exposed to automation. Occupations such as administrative assistants, retail clerks, and finance personnel are already seeing job cuts, and this trend could accelerate as AI is deployed more widely across sectors.
Women also worry about the impact of AI and emerging technologies on personal security. "Fake nudes" and "revenge porn" have been around since the early days of Photoshop, but the problem is now supercharged. A 2019 report published by Deeptrace Labs found that of the 15,000 deepfake videos it identified online, an almost unbelievable (and yet somehow not that surprising) 96% were non-consensual pornographic content featuring women's faces superimposed on sexual images. Deepfake pornography is weaponized against women: famous figures, victims of harassment, and young girls who merely exist online.
Charting a Better Course
In the current AI landscape, benefits and risks are not equitably distributed; power is concentrated in the hands of a few corporations, states, and individuals who control talent, data, and computing resources. There is also no mechanism for weighing broader considerations, like new forms of social vulnerability generated by AI, the disruption of industries and labor markets, the propensity for emerging technology to be used as a tool of oppression, the sustainability of the AI supply chain, or the impact of AI on future generations.
Leaving half the population out of the development process limits our potential and results in subpar technology, not just for women, but for all of us.
Solving this problem will not be a straight line nor an easy task. We need a multi-pronged approach:
1. Prioritize gender equality from the very beginning of AI development, not as an afterthought
2. Assess and clean training data for misrepresentation (a minimal auditing sketch follows this list)
3. Create workplace cultures that actually support women, not just recruit them
4. Implement stronger protections against digital crimes that disproportionately target women
5. Ensure women are represented in leadership positions where key decisions are made
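As a deliberately minimal sketch of recommendation 2, the snippet below shows one way a training-data audit might start: counting gendered terms in a corpus and flagging datasets that drift far from parity for human review. The term lists, the parity threshold, and the function name are illustrative assumptions, not an established standard; real audits use richer lexicons, demographic metadata, and qualitative review.

# A minimal sketch of a pre-training representation audit. Term lists and the
# parity threshold are illustrative assumptions, not an established standard.
import re
from collections import Counter

FEMALE_TERMS = {"she", "her", "hers", "woman", "women"}
MALE_TERMS = {"he", "him", "his", "man", "men"}

def representation_report(documents: list[str]) -> dict:
    """Count gendered terms across documents and report the imbalance."""
    counts = Counter()
    for doc in documents:
        for token in re.findall(r"[a-z']+", doc.lower()):
            if token in FEMALE_TERMS:
                counts["female"] += 1
            elif token in MALE_TERMS:
                counts["male"] += 1
    total = counts["female"] + counts["male"]
    female_share = counts["female"] / total if total else 0.0
    return {
        "female_terms": counts["female"],
        "male_terms": counts["male"],
        "female_share": round(female_share, 3),
        # Flag corpora that drift far from parity for closer human review.
        "flag_for_review": total > 0 and not (0.4 <= female_share <= 0.6),
    }

if __name__ == "__main__":
    sample = [
        "He led the team and he presented the results.",
        "The manager said he approved the budget.",
        "She reviewed the proposal.",
    ]
    print(representation_report(sample))

A report like this fixes nothing by itself, but it makes the skew visible before it is baked into a model, which is the point of putting gender equality at the start of development rather than the end.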
AI is going to reshape our world in profound ways, and we can't afford to build it with the same old biases. We need diverse perspectives not just because it's the right thing to do, but because it's the smart thing to do. After all, if we're creating technology meant to serve all of humanity, shouldn't all of humanity have a say in how it's built?
The future of AI doesn't have to mirror our past. But changing course requires acknowledging where we've gone wrong and taking concrete steps to do better. In the end, the question isn't whether AI will change our world; it's whether we'll use this opportunity to create a world that works better for everyone.