<p><em>By Parmy Olson</em></p><p>It was hard not to cringe at the demo of ChatGPT’s latest upgrade. Instead of showcasing a chatbot that sounded more reliable, OpenAI gave the world one that hit all the notes of an obsequious female, giggling at the antics of its male researchers and praising their outfits. The resulting outrage over the voice’s similarity to Scarlett Johansson missed a deeper point. The world’s leading AI builders are creating software that reinforces stereotypes about women. And there’s a big reason why: there are simply too few of them involved.</p><p>At OpenAI, just 18 per cent of staff working on the development of its technology are women, according to data collected for Bloomberg Opinion by business intelligence firm Glass.ai, which used machine-learning technology to scrutinize tech-company websites and thousands of LinkedIn profiles of AI-focused employees. The creator of ChatGPT fared worst among the leading firms in the survey, conducted in May 2024.</p><p>Although OpenAI’s chief technology officer, Mira Murati, is a woman (and was briefly chief executive officer during last year’s drama when Sam Altman was fired), just 122 of the company’s 686 staff whose jobs involve building AI systems are female. The discrepancy was even worse a year ago, when Glass.ai conducted a similar survey of LinkedIn profiles and found that women made up just 12 per cent of OpenAI’s research staff.</p><p>When women and ethnic minorities don’t play a role in building critical technology like AI, there’s far less chance that someone will call out potential bias before it’s baked into a system. Amazon.com Inc., for instance, once used a recruitment algorithm that weeded out the curriculum vitae of women because it had been trained mostly on those submitted by men. That might not have happened if more women had worked on that system at Amazon.
Algorithms that scan chest X-rays have been shown to systematically underdiagnose female patients, according to the book Invisible Women by Caroline Criado Perez, again because so many medical AI models are trained on data that skews male.</p><p>The latest crop of generative AI tools is just as egregious. Image generators have made women appear more sexualized than men, while an investigative report by Bloomberg News found that Stable Diffusion, the open-source AI image maker, tended to forget women existed altogether. It produced three times more images of men than women. Men dominated pictures of high-paying occupations, like engineers, CEOs or doctors, while women were depicted in low-paying jobs like housekeepers and cashiers.</p><p>None of this will surprise female AI researchers familiar with the field’s legacy of objectifying women. Even before the recent generative AI boom, academics were known to test the performance of their models by using them to put makeup on images of women’s faces, or by swapping out their jeans for miniskirts, according to a recent blog post by Sasha Luccioni, a researcher at open-source AI firm Hugging Face.</p><p>Whenever she spoke up about these methods, Luccioni says, she faced pushback. “It was just a benchmark after all,” she writes, pointing out that women are just as woefully underrepresented in academia, making up 12 per cent of machine-learning researchers.</p><h2>Count them in</h2><p>This is the kind of complex problem that takes years to solve because of its roots in the educational system and entrenched cultural norms. But OpenAI and its peers could hinder many modern-day efforts to level the playing field, like bringing more girls and women into STEM fields, if their systems continue to perpetuate stereotypes. Rooting out the bias in the data they use to train their algorithms is one step toward fixing the problem.
Another is to simply hire more female researchers to resolve the chronic imbalance. Expect their future products to be more cringeworthy — and even harmful — if they don’t. </p>