When prompted to create images of female and male bodies, artificial intelligence platforms overwhelmingly reproduce and amplify narrow Western body ideals, a University of Toronto study has found.
The study, published recently in the journal Psychology of Popular Media, involved prompting three different AI platforms - Midjourney, DALL-E and Stable Diffusion - to create images of female and male bodies, including those of athletes.
The results came as little surprise.
"In a systematic coding of 300 AI-generated images, we found that AI reinforces the fit ideal, with athlete images far more likely to show very low body fat and highly defined muscularity than non-athlete images," says lead author Delaney Thibodeau, a postdoctoral researcher at the Faculty of Kinesiology & Physical Education (KPE).
The research team also included research associate Sasha Gollish, recent master's graduate Edina Bijvoet and KPE Professor Catherine Sabiston, as well as graduate student Jessica E. Boyes from Northumbria University in the U.K.
They found that gendered sexualization persists: female images were more likely to be facially attractive, younger, blond and shown in revealing clothing such as bathing suits, while male images were more often shirtless, hairier and hyper-muscular.
Objectification was common, too, with clothing fit and exposure patterns emphasizing appearance over function, mirroring what the researchers describe as detrimental trends in social media imagery.
Other findings include a lack of diversity, with most images depicting young, white bodies and no images depicting visible disabilities.
"Racial and age diversity were minimal," says Thibodeau, adding that AI defaults to male athletes when unspecified. "When prompted simply for an athlete (no sex specified), 90 per cent of images depicted a male body - revealing an embedded bias toward male representation."
"Overall, our findings underscore the need to investigate how emerging technologies replicate and amplify existing body ideals and exclusionary norms," says Sabiston, who is a Canada Research Chair in physical activity and psychosocial well-being and director of the Mental Health and Physical Activity Research Centre (MPARC) at KPE.
"A human-centred approach - one that is informed by considerations of factors such as gender, race, disability and age - would be advisable when designing AI algorithms. Otherwise, we continue to perpetuate harmful, inflexible and rigid imagery of what athletes should look like."
Users of AI-generated images also have a role to play, according to Sabiston. That includes thoughtfully crafting prompts and considering how the generated images will be presented publicly. Additionally, viewers of AI-generated images should be cautious of interpreting them as authentic and be critical of the biases and potential stereotypes depicted in them.
While more research is needed to track the impact of AI-generated images on psychosocial outcomes such as self-esteem, motivation and body image, the researchers say they are hopeful that greater acceptance of body and weight diversity will occur as more diverse and inclusive images are posted and shared globally.
This research was funded by the Canada Research Chair program.