AI Is Amplifying Harmful Body Image Stereotypes

As a technologist deeply immersed in the world of artificial intelligence, Laurent Giraid has dedicated his work to understanding not just what AI can do, but what it is doing to us. His recent research peels back the glossy veneer of AI-generated imagery to reveal a disturbing reflection of our own societal biases, particularly concerning body image and athletics. In our conversation, we explore the startling homogeneity of AI-generated athletes, the technical feedback loop that amplifies harmful stereotypes, and the profound psychological cost of digital erasure. Giraid breaks down how platforms like DALL-E and Midjourney are recycling our prejudices and offers a critical perspective on how we, as users and creators, can push back against this rising tide of unrealistic ideals.

Your research uncovered a striking gender bias when you simply asked AI for an image of ‘an athlete,’ with 90% of the results being male. Could you walk us through how you designed this study and share an instance from your work that really brought home the extent of this digital stereotyping?

Certainly. We grounded our study in established frameworks of objectification theory and social media’s influence, aiming to systematically analyze what AI shows us when we ask for images of people. We used three of the most popular platforms—DALL-E, Midjourney, and Stable Diffusion—to generate a total of 300 images, comparing depictions of male and female athletes and non-athletes. For each image, we meticulously documented a list of traits: demographics, estimated body fat and muscularity, clothing style, and even facial attractiveness markers like symmetrical features or clear skin. The moment it truly crystallized for me was seeing the results for the simple, neutral prompt “an athlete.” The screen just filled up, image after image, with an army of hyper-muscular, lean men. It wasn’t a subtle skew; it was an overwhelming default, a stark visual declaration that the very concept of “athlete” is fundamentally male in the AI’s lexicon.

Beyond the stark statistics—like 100% of AI-generated female athletes appearing young and 87.5% wearing revealing clothing—what were some of the more subtle, recurring visual cues you noticed? How do these almost imperceptible details work together to reinforce the objectification of women in sports?

It’s the consistency of the details that’s so telling. It wasn’t just that the clothing was revealing; it was also almost universally tight, found in 92.5% of athlete images. Furthermore, there was an unnerving perfection to their appearance that had nothing to do with athletic function. The AI consistently generated women with neat, shiny hair, flawless skin, and perfectly symmetrical features. This combination creates a specific aesthetic that feels less like a depiction of a powerful, performing athlete and more like a carefully curated ideal designed for observation. This is the very essence of objectification: the body is presented as an object to be looked at, its value tied to its appearance rather than its capability. This pressure to conform to an aesthetic ideal, rather than a performance one, is directly linked to negative body image and can tragically push women out of sports altogether.

It’s profound that across 300 generated images, there was a total absence of visible disabilities, wrinkles, or larger body types. From a psychological standpoint, what is the step-by-step impact of this digital erasure, both for an individual who doesn’t see themselves represented and for our society’s collective understanding of what it means to be athletic?

The psychological cascade is both personal and societal. For an individual, the first step is a feeling of invisibility and invalidation. When you never see bodies like yours portrayed as strong, capable, or even just present in an athletic context, it can foster a deep sense of not belonging. This leads to the internalization of these narrow ideals, which can trigger self-objectification and negative body image. The impact on behavior is next; a person may either engage in unhealthy dieting and over-exercising to chase an impossible standard or, conversely, avoid sports and physical activity entirely because they feel alienated. When you consider that approximately 27% of Canadians over 15 have at least one disability, this total erasure is a massive distortion of reality. For society, this constant flood of homogenous imagery dangerously narrows our definition of health and athleticism, reinforcing damaging forms of discrimination like ableism, ageism, and fatphobia.

The article powerfully states that AI is ‘recycling our prejudices.’ Could you demystify this process for us? How exactly does a societal bias, like the ones prevalent in online media, get absorbed and then magnified by an AI platform like DALL-E or Midjourney, creating this vortex of exaggerated physical ideals?

It’s crucial to understand that these AI systems are not thinking or creating in a human sense. They are incredibly sophisticated pattern-recognition engines that learn from a massive dataset: the internet. The AI scrapes billions of images and the text associated with them. If, over decades, our media has overwhelmingly tagged images of lean, muscular, able-bodied people with the word “athlete,” the AI learns that this pattern is the “correct” answer. It doesn’t just replicate the bias; it refines and concentrates it. Diversity, like images of para-athletes or older athletes, becomes statistical noise that the system filters out in favor of the most dominant, stereotypical representation. This creates a dangerous feedback loop. The AI produces exaggerated ideals based on our biased data, we share these compelling images on social media, and that new content then becomes part of the future data set for the next generation of AI to learn from, making the vortex of unreachable standards even stronger.
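The dynamic Giraid describes can be made concrete with a toy simulation. The sketch below is purely illustrative — the labels, weights, and the `sharpen` exponent are assumptions standing in for how a generative model concentrates probability on its dominant training pattern, not a model of any real system. It shows how, when a model's outputs are fed back into its own training data, minority representations shrink with each generation:

```python
import random
from collections import Counter

# Hypothetical toy dataset (illustrative labels, not real data):
# images tagged "athlete", dominated by one body type.
random.seed(0)
dataset = (["lean_muscular_male"] * 80 +
           ["female_athlete"] * 15 +
           ["para_athlete"] * 5)

def generate(data, n=100, sharpen=2.0):
    """Sample new images, over-weighting already-common patterns.

    Raising each frequency to `sharpen` > 1 is a stand-in for how a
    generative model concentrates probability mass on its dominant
    training pattern, treating rarer patterns as noise.
    """
    counts = Counter(data)
    labels = list(counts)
    weights = [counts[label] ** sharpen for label in labels]
    return random.choices(labels, weights=weights, k=n)

shares = []
for generation in range(5):
    new_images = generate(dataset)  # the AI's output...
    dataset += new_images           # ...re-enters the future training set
    shares.append(dataset.count("para_athlete") / len(dataset))

# Minority representation shrinks with each turn of the loop.
assert shares[-1] < shares[0]
```

Run with these numbers, the para-athlete share falls well below its initial 5% within a few generations — a crude but faithful picture of the vortex Giraid describes, where each round of AI output makes the next round's training data less diverse.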

You’ve painted a clear picture of the problem. For the everyday content creator or social media user who wants to be part of the solution, what are some tangible, intentional actions they can take to challenge these AI biases and help cultivate a more inclusive and realistic digital landscape?

The power lies in being intentional and critical. First, when writing prompts, be specific and conscious of inclusivity. Instead of just “a runner,” try “a determined female marathon runner with a prosthetic leg crossing the finish line” or “a joyful group of senior women playing pickleball.” By deliberately including descriptors of diversity—age, body size, disability, race—you are actively feeding the system new, more representative patterns. Second, be critical of the output. If the AI generates a stereotypical image, don’t just accept it. Refine your prompt or, more importantly, choose not to use images that perpetuate harmful ideals. We have to remember that we are the original creators of the content that trains these systems. Every time we create, post, and tag an image that reflects a more authentic and diverse reality, we are contributing to a better, more equitable data set for the future.

What is your forecast for the evolution of AI-generated imagery and its impact on body image?

I see two potential paths forward. If we remain passive consumers, the forecast is grim. The feedback loop I described will intensify, and the digital world will become saturated with ever more homogenous, exaggerated, and unattainable body ideals. This could lead to a significant increase in body dissatisfaction, loneliness, and the mental health issues that stem from them as the gap between digital fantasy and physical reality widens. However, there is a more hopeful path. Awareness of AI bias is growing, and conversations like this are becoming more common. If users become more critical, if developers prioritize ethical considerations and diverse training data, and if society makes a conscious effort to value and showcase every kind of body, we can steer this powerful technology. We could use AI not to distort reality, but to help us imagine and create a media landscape that truly reflects the beautiful, diverse tapestry of humanity.
