UFO
Written by Mx. Varsha

Can AI Create Without Bias?

Rethinking Representation in Media.


Representation matters. Whether in films, literature, or emerging technologies like artificial intelligence (AI), the stories we tell—and the systems we create—shape how we see ourselves and each other. Yet, history reminds us that representation has often been skewed, particularly when it comes to marginalized communities, including women, LGBTQIA+ individuals, and people of color.


In 2024, tools like DALL-E, Runway, and AI-assisted scriptwriting platforms are reshaping how we create stories. But these tools, like traditional systems, reflect the biases of the data they’re built on. How do we, as filmmakers, artists, and technologists, ensure these tools amplify diverse voices rather than silencing them?


Prompt: A trans activist leading a rally.

AI’s struggles with diversity in film


When filmmakers use tools like Runway to generate scenes or DALL-E to create visuals, the output often reveals the limitations of the training data. Here are some examples:


  • AI Characters Missing the Mark: In a recent experiment, Runway was tasked with generating characters for a "diverse ensemble cast." The results? Characters leaned heavily on Eurocentric beauty standards, with minimal representation of darker skin tones, South Asian features, or gender diversity.


Prompt: A beautiful person in traditional dress.
  • Visual Tropes Reinforced: Prompts like “a queer South Asian artist” or “a nonbinary warrior” frequently yield visuals shaped by Western ideals, such as rainbow motifs or clichéd androgyny, missing the nuances of these identities in non-Western contexts. Similarly, “a same-sex couple celebrating their wedding” often produces a white, cisgender couple, with one partner presenting as traditionally masculine and the other as feminine. This reinforces stereotypical relationship dynamics and fails to account for the diversity of expressions, identities, and cultural contexts within the LGBTQIA+ community.


Prompt: A nonbinary artist painting a mural.
Prompt: A same-sex couple celebrating their wedding.
  • Reinforcement of Gender and Racial Stereotypes: A study analyzing the AI image generator Stable Diffusion found that it perpetuates racial and gender stereotypes. For instance, when prompted with certain professions, the AI predominantly generated images of white males, underrepresenting women and people of color.

Prompt: A software engineer working in an office.
  • Cultural Homogeneity: Research has shown that AI image generators like DALL-E and Stable Diffusion often produce images reflecting Western-centric aesthetics. For example, when given prompts like "house," the generated images predominantly depicted Western-style houses, neglecting the architectural diversity found across cultures.


Prompt: A family in traditional attire.
  • Exclusion of Body Type and Disability Representation: AI tools often fail to represent diverse body types and people with disabilities authentically, defaulting to unrealistic or stereotypical imagery. For example, “A group of friends enjoying a picnic” often generates thin, able-bodied individuals, while “A disabled spokesperson giving a speech on stage” typically shows a man in a suit using a wheelchair, overlooking the broader diversity of disabled people.


These missteps highlight the critical need for inclusive data in AI training. While these tools offer exciting possibilities, their outputs often reflect the gaps in representation they were trained on.


Prompt: A group of friends enjoying a picnic.

Artists and tools leading the charge


It’s not all bleak. Artists, companies, and researchers worldwide are actively addressing these challenges:


  • Inclusive Datasets: Organizations like Queer in AI, Diversity AI, and the Algorithmic Justice League are developing datasets that prioritize representation, offering models trained on diverse, global perspectives. Studies have consistently shown that inclusive datasets not only improve representation but also enhance the overall reliability of AI outputs (Knowledge at Wharton).


Prompt: A disabled spokesperson giving a speech on the stage.
  • AI Tools for Equity: Independent filmmakers are increasingly using platforms like ScriptAI+, which incorporates feedback from marginalized creators to refine its outputs, ensuring culturally sensitive and authentic narratives.


  • Collaborative Models: Artists like Sougwen Chung, who combines AI and human collaboration in her work, and initiatives like Runway’s Hundred Film Fund support creators in integrating AI tools to tell diverse and culturally rich stories. By blending traditional storytelling with AI innovation, these projects actively reflect local cultures and amplify underrepresented voices.


These efforts remind us that inclusivity isn’t just a checkbox—it’s an ongoing, collaborative process that evolves with the stories we tell and the tools we build.


Prompt: A successful entrepreneur standing in an office.

Your role in creating change


Representation isn’t just the responsibility of filmmakers or technologists—it’s a collective effort. Here’s how you can be part of the change:


  • Explore and Critique AI Tools: Test platforms like Runway or DALL-E with your own prompts and critically evaluate their outputs. Share your findings to highlight gaps and advocate for better representation.


  • Support Diverse Creators: Engage with films, zines, art, writing, and even indie games that amplify underrepresented voices. By supporting these creators—whether through watching, sharing, funding, or collaborating—you help ensure their work reaches wider audiences. Creative outputs like narrative-driven games or AI-assisted writing projects are increasingly becoming powerful tools for storytelling, offering new ways to explore marginalized perspectives and challenge mainstream narratives.


Prompt: An athlete preparing for a race.
  • Learn With Us: At Star Hopper, we’re constantly experimenting with technology and storytelling to explore inclusivity in creative expression. From NFT hackathons to immersive VR theatre showcases, we engage critically with the tools shaping our industry. Our aim is to reflect on and learn how emerging technologies like AI can better serve diverse voices and stories.
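One lightweight way to act on the first suggestion above is to run the same prompt many times, annotate the outputs yourself, and tally what comes back. The sketch below is a hypothetical audit helper, not any platform’s API: the function name, labels, and example annotations are our own, and the image-generation step is left out since each tool has its own interface.

```python
from collections import Counter

def representation_report(annotations):
    """Tally human-assigned attribute labels across a batch of
    AI-generated images and return each label's share of the batch."""
    counts = Counter(annotations)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

# Hypothetical annotations for 10 images returned by the prompt
# "a software engineer working in an office" (labels assigned by
# a human reviewer, not by the model):
skin_tones = ["light"] * 8 + ["medium"] + ["dark"]

print(representation_report(skin_tones))
# e.g. {'light': 0.8, 'medium': 0.1, 'dark': 0.1}
```

Even a tally this simple makes a gap concrete and shareable: instead of “the outputs felt homogeneous,” you can say “8 of 10 results defaulted to the same presentation.”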


The future of AI in filmmaking has the potential to be revolutionary—if we do the work. Imagine AI models trained on datasets that include stories of communities fighting climate change, trans warriors reclaiming their narratives, and richly textured depictions of diverse cultures from across the globe. Visual generators could create protagonists that reflect a spectrum of identities—without defaulting to stereotypes or flattening their uniqueness—and ensemble casts that authentically represent the world’s diversity without erasure.


Prompt: Communities fighting against climate change.

As marginalized filmmakers, we’re critically examining these tools, learning as we go, and striving to do better. We don’t claim to have all the answers—it’s about asking the right questions and creating space for stories that challenge the status quo.


The tools we use are only as inclusive as the people behind them. The stories we tell are only as powerful as the voices we center. Let’s make them count.



 


Disclaimer:

All images used in this post are sourced from the internet and used solely for educational and commentary purposes. They remain the property of their rightful owners. The opinions? Purely ours. And shared to inspire thoughtful conversation.





Underground Film Observatory (UFO)
A space by Star Hopper for the exploration, curation, and exhibition of radical moving image works and artistic experiments, centered on feminist and queer narratives.
