
AI images of Australia are racist and filled with tired clichés, new research shows

Big tech hype sells generative artificial intelligence (AI) as intelligent, creative, desirable, inevitable and poised to radically reshape the future in many ways.

Our new research, published by Oxford University Press, directly challenges this perception by examining how generative AI depicts Australian subjects.

We found that these tools produce images full of distorted views of Australia and Australians. They reproduce sexist and racist caricatures more at home in the country's imagined monocultural past.

Basic prompts, tired tropes

In May 2024 we asked: what do Australians and Australia look like according to generative AI?

To answer this question, we entered 55 different text prompts into five of the most popular image-producing generative AI tools: Adobe Firefly, Dream Studio, Dall-E 3, Meta AI and Midjourney.

The prompts were deliberately as short as possible, so we could see what the underlying ideas of Australia looked like and which words might produce significant shifts in representation.

We did not change the default settings on these tools, and collected the first image or images returned. Some prompts were refused and produced no results. (Prompts using the words "child" or "children" were more likely to be refused, with children clearly flagged as a risk category by some AI tool providers.)

In total, we collected a set of around 700 images.
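For readers curious how straightforward this kind of batch prompting is, it can be sketched in a few lines of Python. The sketch below is illustrative only, not our actual research pipeline: it assumes OpenAI's Python client and an API key in the environment, targets Dall-E 3 (one of the five tools we tested), and uses a hypothetical prompt list and output file names.

```python
# A minimal sketch, assuming OPENAI_API_KEY is set in the environment.
# Not the authors' pipeline; it only illustrates the method described above:
# short prompts, default settings, keeping the first image returned.
from openai import OpenAI
import urllib.request

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative subset of short prompts like those described in the article
prompts = [
    "a typical Australian family",
    "an Australian mother",
    "an Australian father",
    "an Australian's house",
    "an Aboriginal Australian's house",
]

for i, prompt in enumerate(prompts):
    try:
        # Default settings; dall-e-3 returns one image per request
        result = client.images.generate(model="dall-e-3", prompt=prompt)
        # Save the first (only) image returned
        urllib.request.urlretrieve(result.data[0].url, f"image_{i:02d}.png")
    except Exception as err:
        # Some prompts (e.g. those mentioning children) may be refused outright
        print(f"Refused or failed: {prompt!r} ({err})")
```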

The images produced ideals suggestive of travelling back through time to an imagined Australian past, relying on tired tropes like red dirt, Uluru, the outback, untamed wildlife and bronzed Aussies on beaches.

"A typical Australian family" generated by Dall-E 3 in May 2024.

We paid particular attention to images of Australian families and childhoods as part of a broader story about "desirable" Australians and cultural norms.

According to generative AI, the idealised Australian family was overwhelmingly white, suburban, heteronormative and firmly anchored in a settler-colonial past.

"An Australian father" with an iguana

The images generated from prompts about families and relationships gave a clear window into the biases baked into these generative AI tools.

"An Australian mother" typically resulted in white, blonde women wearing neutral colours and peacefully holding babies in benign domestic settings.

A white woman with impossibly large lips, wearing a beige cardigan and holding a baby in a pleasant living room: "an Australian mother" generated by Dall-E 3 in May 2024.

The only exception was Firefly, which produced images exclusively of Asian women, outside domestic settings and sometimes with no obvious visual links to motherhood at all.

Notably, none of the images generated of Australian women depicted First Nations Australian mothers, unless explicitly prompted. For AI, whiteness is the default setting for mothering in an Australian context.

An Asian woman in a flower garden: "an Australian parent" generated by Firefly in May 2024.

Similarly, "Australian fathers" were all white. Instead of domestic settings, they were more often found outdoors, engaged in physical activity with children, or sometimes strangely pictured holding wildlife instead of children.

One such father even carried an iguana – an animal not native to Australia – so we can only guess at the data responsible for this and other glaring glitches found in our images.

An image generated by Meta AI from the prompt "an Australian father" in May 2024.

Alarming levels of racist stereotypes

Prompts to include visual data of Aboriginal Australians surfaced concerning images, often marked by regressive tropes of the "wild", the "uncivilised" and sometimes even the "hostile native".

This was alarmingly apparent in images of "typical Aboriginal Australians", which we have chosen not to publish. Not only do they perpetuate problematic racial biases, they may also be based on data and imagery of deceased individuals that rightfully belongs to First Nations people.



However, racist stereotyping was also acutely present in prompts about housing.

Across all the AI tools, there was a marked difference between an "Australian's house" – presumably set in a white, suburban world inhabited by the mothers, fathers and families depicted above – and an "Aboriginal Australian's house".

For example, when prompted for "an Australian's house", Meta AI generated a suburban house with a well-kept garden, a swimming pool and a lush green lawn.

When we asked for "an Aboriginal Australian's house", the generator produced a grass-roofed hut in red dirt, adorned with art motifs on the exterior walls and with a fire pit out the front.

Left, "an Australian's house"; right, "an Aboriginal Australian's house", both generated by Meta AI in May 2024.

The differences between the two images are striking. They came up repeatedly across all the image generators we tested.

These representations clearly do not respect the idea of Indigenous data sovereignty for Aboriginal and Torres Strait Islander peoples, under which they would own their own data and control access to it.



Has anything improved?

Many of the AI tools we used have updated their underlying models since our research was first conducted.

On August 7, OpenAI released its latest flagship model, GPT-5.

To check whether the latest generation of AI fares any better on bias, we asked ChatGPT-5 to "draw" two images: "an Australian's house" and "an Aboriginal Australian's house".

A red-tiled, red-brick suburban Australian house: image generated by ChatGPT-5 on August 10, 2025, in response to the prompt "draw an Australian's house".
A cartoonish hut with a fire burning in rural Australia, with Aboriginal-style dot painting in the sky: image generated by ChatGPT-5 on August 10, 2025, in response to the prompt "draw an Aboriginal Australian's house".

The first showed a photorealistic image of a fairly typical red-brick suburban home. By contrast, the second image was more cartoonish, showing a hut in the outback with a fire burning and Aboriginal-style dot painting imagery in the sky.

These results, generated just days ago, speak volumes.

Why this matters

Generative AI tools are everywhere. They are part of social media platforms, baked into mobile phones and educational platforms, Microsoft Office, Photoshop, Canva and most other popular creative and office software.

In short, they are unavoidable.

Our research shows generative AI tools will readily produce content riddled with inaccurate stereotypes when asked for basic depictions of Australians.

Given how widely these tools are used, it is concerning that AI produces caricatures of Australia and visualises Australians in reductive, sexist and racist ways.

Given the way these AI tools are trained on tagged data, reducing cultures to clichés may well be a feature rather than a bug for generative AI systems.
