
AI Development Needs More Women. Here's What Leaders Can Do About It

Published 5 days ago · 5 minute read

AI accountability, accuracy and societal impact could improve with more women AI developers.


The gender gap in generative AI isn’t just about who uses the tools—it’s also about who builds them. AI developers make crucial decisions—designing models, selecting training data, determining data usage, developing testing protocols—that affect AI’s accountability, accuracy, and societal impact.

Just 31% of AI professionals are women, according to a 2024 LinkedIn analysis. In areas such as algorithm development, machine learning and text mining, the representation gap is even wider.

The gender gap in AI expertise also plagues AI research: all-male teams author 75% of AI-related scientific publications, according to an analysis of over 74,000 AI-related scientific articles in fields such as physics, math, computer science and engineering.

Yet women place a higher priority on responsible AI values, according to a survey of 743 individuals from the general U.S. population, 755 crowd workers and 175 AI practitioners conducted by Cornell University, Harvard University and Microsoft Research. The researchers found that women were four to five percentage points more likely than men to rate AI values such as privacy, safety, accountability, fairness and human autonomy as very or extremely important.

Here’s how more gender diversity in AI development teams could ensure more accountable, accurate and socially responsible AI technologies—and how leaders can increase the representation of women in their AI teams.

AI systems can sound excessively confident and knowledgeable, yet they still make mistakes. And when mistakes occur, AI firms and even technical experts may be hard-pressed to explain why.

Bad actors may also corrupt AI technologies by feeding false information into training data or maliciously manipulating AI decisions. Grok, the generative AI chatbot from Elon Musk's xAI, made headlines for inserting unprompted mentions of "white genocide" into its responses after an unauthorized code modification. Greater AI accountability can reduce such risks.

Adding the voices of women AI professionals to the development process could increase the accountability of AI systems and minimize the risks of opacity, bias and manipulation.

AI’s gender biases—in healthcare, hiring, criminal justice and access to financing—are well-documented and often stem from biased training data. But gender bias may also arise when AI inaccuracies are tolerated more readily because they affect women. For example, wrist-worn Fitbit devices may be less accurate for women because Fitbit’s AI is worse at detecting the arm movements of people with shorter stature and stride length, according to a Centers for Disease Control and Prevention study.

AI development teams dominated by men may simply be oblivious to, or uninterested in, features that could be valuable for women. For example, Apple’s health app tracked an extensive range of metrics when it launched in 2014, yet it notably offered no tracking of menstrual cycles.

While the composition of the Fitbit and Apple health app development teams is not publicly known, it stands to reason that more gender-inclusive AI development teams could avoid such oversights and blind spots.

For all of its potential, AI comes with steep environmental and societal costs. The large language models that fuel generative AI require extensive resources and have a considerable carbon footprint. AI tools can make people more productive, but not necessarily happier at work. And concerns about AI’s role in eroding critical thinking are mounting, especially in education.

Women, on average, tend to think about a broader range of stakeholders in decision-making, including the environment, and anticipate more risks. This may explain why AI research teams with at least one woman are more likely to explore topics with broad societal relevance—like fairness, human mobility and misinformation, according to the analysis of AI publications.

The societal implications of AI reinforce the importance of ensuring that AI development teams consider a wide range of stakeholders, something more gender-diverse teams are more likely to do.

Strategies for increasing the representation of women in AI development roles align with those intended to increase female representation in STEM more generally. Here are some you could follow:

Support early AI and STEM education to reinforce the slow but upward trajectory in women’s completion of technical degrees. For example, the 2025 executive order “Advancing artificial intelligence education for American youth,” which aims to promote AI literacy in K-12, offers concrete opportunities for narrowing the gender gap. Early exposure to STEM programs appears to encourage more girls to pursue technical degrees.

Firms can deploy a number of strategies to stand out as an employer of choice for female AI professionals. Check that your job ads reach a broad range of potential candidates. Use gender-neutral job descriptions to increase the number of women who apply. Consider how your recruiting process might correct for women’s tendency to report their technical skills more conservatively than men do. And involve technical women in the interview process: female job candidates who interview with female role models are more likely to accept offers.

Recruiting and retention efforts without buy-in and culture change are unsustainable. To get people on board, communicate the innovation value of AI development teams that represent a diversity of perspectives and lived experiences. Invest in sharpening AI professionals’ inclusive leadership skills. And adopt interventions proven to reduce biases against women at work.

As our understanding of the ethical and societal ramifications of GenAI grows, one thing is clear: AI teams must listen to more diverse voices to improve the accountability, accuracy and societal impact of the tools they develop.
