Generative AI has revolutionized the way we create and consume images. Tools like Midjourney, DALL-E and Sora can now conjure up anything from realistic photos to oil-painting-style artworks, all from a brief text prompt.
These images flow into social media feeds in ways that make it difficult to discern their artificial origin. But even simply creating and sharing AI images poses serious social risks.
Studies show that generative AI models, trained on data scraped from the internet and other digital sources, routinely mirror sexist and racist stereotypes, for instance portraying pilots as men or criminals as people of color.
My recent research shows that generative AI also has a colonial bias.
When Sora is asked to imagine Aotearoa New Zealand's past, it defaults to the European settler perspective: pre-colonial landscapes are depicted as empty wilderness, Captain Cook appears as a peaceful civiliser, and Māori are depicted as timeless, peripheral figures.
As generative AI tools increasingly shape our communication, such representations matter. They naturalize myths of benevolent colonization and undermine Māori claims to political sovereignty, reparations and cultural revitalization.
“Sora, what was the past like?”
To explore how AI imagines the past, I prompted Sora, OpenAI's text-to-video model, to create visual scenes from the history of Aotearoa New Zealand, from the seventeenth century to the 1860s.
The prompts were intentionally left open, a standard approach in critical AI research, to reveal the model's default visual assumptions rather than prescribe what should appear.
Because generative AI systems work with probabilities, predicting the most likely combination of visual elements based on their training data, the outcomes were remarkably consistent: identical prompts produced nearly identical images over and over again.
Two examples illustrate the visual patterns that appeared time and again.
In Sora's vision of “18th-century New Zealand,” a steep forested valley is bathed in golden light, with Māori figures reduced to decorative details. There are no food plantations or pā fortifications, just wilderness waiting to be discovered by Europeans.
This aesthetic draws directly on the romantic landscape tradition of nineteenth-century colonial painting, for instance the work of John Gully, which depicted the land as untouched and unclaimed (so-called terra nullius) to justify colonization.

When asked to portray “a Māori in the 1860s,” Sora defaults to a sepia-toned studio portrait: a dignified man in a cloak, posing against a neutral background.
The similarity to photographs from the late nineteenth century is striking. Such portraits were typically staged by European photographers, who provided props to create an image of the “authentic native.”
It is telling that Sora instinctively resorts to this format, even though the 1860s were marked by armed and political resistance from Māori communities as colonial forces sought to impose British authority and confiscate land.
Recycling old sources
Visual images have always played a central role in legitimizing colonization. In recent decades, however, this colonial visual regime has been repeatedly challenged.
As part of the Māori rights movement and a broader reckoning with history, statues have been removed, museum exhibitions revised, and representations of Māori in visual media have shifted.
But the old images haven’t disappeared. They survive in digital archives and online museum collections, often decontextualized and without critical interpretation.
And while the exact sources of generative AI training data are unknown, it is very likely that these archives and collections are part of what systems like Sora learn from.
Generative AI tools effectively recycle these sources, faithfully reproducing the conventions that once served the imperial project.
But images that portray colonization as peaceful and consensual can blunt the perceived urgency of Māori claims for political sovereignty and reparations through institutions such as the Waitangi Tribunal, as well as calls for cultural revitalization.
By portraying the Māori of the past as passive, timeless figures, these AI-generated visions obscure the continuity of Māori movements for self-determination, sovereignty and independence.

AI literacy is key
Around the world, researchers and communities are working to decolonize AI and develop ethical frameworks that enshrine Indigenous data sovereignty and collective consent.
But visual generative AI presents unique challenges because it works not only with data but also with images that shape how people see history and identity. Technical fixes can help, but each has its limitations.
Expanding the datasets to include Māori-curated archives or images of resistance could diversify the model's outputs, but only if done under the principles of Indigenous data and visual sovereignty.
Debiasing algorithms could theoretically offset what Sora exhibits when asked to depict colonization. But the definition of “fair” representation is a political question, not just a technical one.
Filters may block the most biased results, but they can also erase uncomfortable truths, such as depictions of colonial violence.
Perhaps the most promising solution lies in AI literacy. We need to understand how these systems work, what data they use and how we can steer them effectively.
When approached critically and creatively, as some social media users already do, AI can move beyond recycling colonial tropes and become a medium for reconsidering the past from Indigenous and other perspectives.

