What does it look like to have an "athletic body"? And what does artificial intelligence think it looks like?
A recent study we conducted at the University of Toronto analyzed appearance-related features of AI-generated images of female and male athletes and non-athletes. We found that we are being fed extreme – and likely impossible – body standards.
Even before AI, athletes were pressured to look a certain way: thin, muscular and attractive. Coaches, opponents, spectators and the media all shape how athletes think about their bodies.
But these pressures and body ideals have little to do with performance; they are tied to objectification of the body. And this phenomenon is, unfortunately, linked to negative body image, poor mental health and reduced athletic performance.
Given the growing use of AI in social media, it is critical to understand how AI represents the bodies of athletes and non-athletes. What is – or is not – shown as "normal" is widely absorbed and can quickly become normalized.
Slim, young, muscular – and predominantly male
As researchers with expertise in body image, sports psychology and social media, we based our study on objectification and social media theories. We created 300 images using various AI platforms to examine how the bodies of female and male athletes and non-athletes are represented.
We documented demographics, body fat percentage and muscularity. For each image, we assessed the fit and type of clothing, facial attractiveness (for example, well-groomed and glossy hair, symmetrical features or clear skin) and body exposure. Indicators of visible disability, such as mobility aids, were also noted. We compared the characteristics of female and male images, as well as those of athlete and non-athlete images.
The images of men generated by the AI were mostly young (93.3 percent), slim (68.4 percent) and muscular (54.2 percent). The images of women were young (100 percent), thin (87.5 percent) and in revealing clothing (87.5 percent).
The AI-generated images of athletes were slim (98.4 percent), muscular (93.4 percent) and wore tight (92.5 percent) and revealing (100 percent) training clothes.
Non-athletes wore looser clothing and showed a wider range of body sizes. Even when we asked only for an image of "an athlete," 90 percent of the images generated were male. No images showed visible disabilities, larger bodies, wrinkles or baldness.
These results show that generative AI perpetuates stereotypes about athletes, portraying them as fitting only a narrow set of characteristics – able-bodied, attractive, thin, muscular, exposed.
The results of this research illustrate the ways in which three commonly used generative AI platforms – DALL-E, MidJourney and Stable Diffusion – reinforce problematic appearance ideals for all genders, athletes and non-athletes alike.
The real cost of distorted body ideals
Why is this a problem?
More than 4.6 billion people use social media, and 71 percent of social media images are generated by AI. That is an enormous number of people seeing, over and over again, images that promote self-objectification and the internalization of unrealistic body ideals.
People may then feel compelled to diet and exercise excessively because they feel bad – their bodies don't look like AI-made images. Alternatively, they may become less physically active or quit sport altogether.
A negative body image doesn't just affect young people's academic performance, but also their sporting performance. While staying active can promote a better body image, a negative body image has exactly the opposite effect: it reinforces dropout and avoidance.

Given that roughly 27 percent of Canadians over the age of 15 have at least one disability, it is striking that none of the images created showed a person with a visible disability. Not only are disabilities not depicted when images are generated, AI has also reportedly been used to erase disabilities from images of real people.
People with higher body fat, wrinkles or hair loss were also largely absent.
Addressing bias in the next generation of AI
These patterns show that AI is neither realistic nor creative in its representations. Instead, it draws on the vast database of media available online, where the same harmful beauty ideals dominate. It recycles our prejudices and forms of discrimination and feeds them back to us.
AI is learning body ideals from the same biased society that has long fueled body image pressures. The result is a lack of diversity and a spiral of unattainable standards. AI-generated images depict exaggerated, idealized bodies that ultimately narrow the range of bodies people see, leading to lower body satisfaction and greater loneliness.
As the original creators of the visual content that trains AI systems, society therefore has a responsibility to ensure these technologies don't entrench ableism, racism, fatphobia and ageism. Generative AI users must be intentional when writing image prompts and critical when interpreting the results.
We must limit the kinds of body standards we internalize through AI. As AI-generated images continue to populate our media landscape, we must be mindful of our exposure to them. Ultimately, if we want AI to reflect reality rather than distort it, we must insist on seeing and appreciating all kinds of bodies.

