AI tool Stable Diffusion amplifies race and gender stereotypes
A free-to-use AI model, Stable Diffusion, has been found to perpetuate gender and racial stereotypes when generating images from text prompts, according to a study by Bloomberg. The test asked the AI to create 5,100 images related to job titles in 14 fields, as well as three crime categories, and found that lighter skin tones were prevalent in images associated with “high-paying” jobs, while “low-paying” jobs were more commonly depicted with darker skin tones. The study also found the AI generated significantly more images of men than women, with only a few categories, such as cashier, teacher, social worker and housekeeper, dominated by women. Darker-skinned men were more commonly depicted in janitor positions. Stable Diffusion is owned by London-based startup StabilityAI, which offers bias evaluation techniques, and has already been adopted by firms including graphic design platform Canva and AI-powered virtual photo studio Deep Agency.
FAQs
What is Stable Diffusion?
Stable Diffusion is a free-to-use AI model used to generate images from text prompts.
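For readers unfamiliar with how such a model is invoked, the hedged sketch below shows one common way to generate an image from a text prompt using the openly released Stable Diffusion weights through the Hugging Face diffusers library. The checkpoint name and prompt are illustrative assumptions, not details taken from the Bloomberg study.

```python
# Illustrative sketch only: text-to-image generation with open Stable Diffusion
# weights via the Hugging Face "diffusers" library. The checkpoint ID and prompt
# below are example choices, not part of the Bloomberg methodology.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # one publicly available checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a GPU is strongly recommended

image = pipe("a color photograph of a doctor, portrait").images[0]
image.save("doctor.png")
```

Because each run of a prompt can produce very different imagery, audits of this kind typically generate hundreds of samples per keyword before drawing conclusions.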
What did the Bloomberg study find?
Bloomberg found that Stable Diffusion perpetuated gender and racial stereotypes, generating more images of men than women in all but four of the 14 jobs. Lighter skin tones were more common in images associated with “high-paying” jobs, while “low-paying” jobs were more commonly depicted with darker skin tones.
Who owns Stable Diffusion?
Stable Diffusion is owned by London-based startup StabilityAI, which offers bias evaluation techniques.
Which companies have adopted Stable Diffusion?
Stable Diffusion has been adopted by graphic design platform Canva and AI-powered virtual photo studio Deep Agency, among others.
Why is the removal of bias from generative AI tools important?
As police forces begin to use generative AI tools to create photo-realistic composite images of suspects, the need to eliminate bias from these tools has become more pressing. Biased AI models have previously led to thousands of wrongful arrests.
Stable Diffusion, an AI tool, exacerbates gender and racial biases.
A recent study has found that Stable Diffusion, a popular generative AI tool that creates images from text prompts, is rife with gender and racial stereotypes when it comes to rendering people in “high-paying” and “low-paying” jobs. The test, conducted by Bloomberg, asked the free-to-use AI model to render 5,100 images from written prompts related to job titles in 14 fields, plus three categories related to crime. The results were analyzed against the Fitzpatrick Skin Scale, a six-point scale used by dermatologists to assess the amount of pigment in someone’s skin.

Bloomberg found that all of the “high-paying” jobs, such as architect, doctor, lawyer, CEO and politician, were typically portrayed with lighter skin tones, while “low-paying” jobs, such as janitor, dishwasher, fast-food worker and social worker, were predominantly depicted with darker skin tones. When Bloomberg categorized the job-related images by perceived gender, it found the tool had generated nearly three times as many images of men as women, with all but four jobs dominated by men. Stanford University found that of the 300 images created for each of the 14 jobs, all but two images for the keyword “engineer” portrayed men, while zero images of women were generated for the keyword “janitor.” The AI also stereotypically rendered terrorists as men with dark facial hair, often wearing head coverings, imagery that Bloomberg found leaned on stereotypes of Muslim men.

These biases stem from the training data: every AI model reflects the skews of the dataset it is trained on. While this can be mitigated by training models on datasets specific to different countries and cultures, the AI industry will also need to improve bias evaluation techniques and develop solutions that go beyond basic prompt modification.
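As a rough illustration of how such a prompt-based audit could be scripted, the hedged sketch below generates a small batch of images per job title and records a crude brightness score for each. This is not Bloomberg's pipeline: the job list, sample count and luminance proxy are assumptions made for illustration, whereas the actual study assessed 300 images per keyword against the Fitzpatrick Skin Scale.

```python
# Hedged sketch of a prompt-based bias audit; NOT Bloomberg's methodology.
# Job titles, sample counts, and the crude central-luminance proxy (standing in
# for Fitzpatrick skin-type annotation) are illustrative assumptions.
import numpy as np
import torch
from diffusers import StableDiffusionPipeline

JOB_TITLES = ["doctor", "lawyer", "janitor", "cashier"]  # example subset only
IMAGES_PER_JOB = 8  # the real study used far more samples (300 per keyword)

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def center_brightness(img) -> float:
    """Crude proxy: mean luminance of the central crop, where a portrait's face
    usually sits. A real audit would detect faces and use human annotators to
    assign Fitzpatrick skin types and perceived gender."""
    arr = np.asarray(img.convert("L"), dtype=np.float32)
    h, w = arr.shape
    crop = arr[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
    return float(crop.mean())

results = {}
for job in JOB_TITLES:
    prompt = f"a color photograph of a {job}, portrait"
    scores = [center_brightness(pipe(prompt).images[0]) for _ in range(IMAGES_PER_JOB)]
    results[job] = sum(scores) / len(scores)

for job, score in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{job:>10}: mean central luminance {score:.1f}")
```

The point of structuring an audit this way is that bias shows up in aggregate statistics across many generations per prompt, not in any single image, which is why per-keyword sample counts matter.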