
Study finds brain reacts differently to human and AI voices

A new study shows that while humans struggle to tell human and AI voices apart, our brains respond differently when we hear them.

As AI voice cloning becomes more advanced, it raises ethical and safety concerns that humans haven’t faced before.

Does the voice on the other end of the phone call belong to a human, or was it generated by AI? Do you think you’d be able to tell?

Researchers from the Department of Psychology at the University of Oslo tested 43 people to see if they could distinguish human voices from AI-generated ones.

The participants were equally bad at accurately identifying human voices (56% accuracy) and AI-generated ones (50.5% accuracy).

The emotion of the voice affected how likely participants were to identify it accurately. Neutral AI voices were identified with 74.9% accuracy, compared with only 23% accuracy for neutral human voices.

Happy human voices were accurately identified 77% of the time, while happy AI voices were identified with a concerningly low 34.5% accuracy.

So, if we hear an AI-generated voice that sounds happy, we’re more likely to assume it’s human.

Spectrograms showing how similar human and AI voices are. Source: FENS Forum / Christine Skjegstad

Even though we consciously struggle to identify an AI voice accurately, our brain seems to pick up on the differences at a subconscious level.

The researchers performed fMRI scans of the participants’ brains as they listened to different voices. The scans revealed significant differences in brain activity in response to the AI and human voices.

The researchers noted, “AI voices activated the right anterior midcingulate cortex, right dorsolateral prefrontal cortex and left thalamus, which may indicate increased vigilance and cognitive regulation.

“In contrast, human voices elicited stronger responses in the right hippocampus as well as regions related to emotional processing and empathy, such as the right inferior frontal gyrus, anterior cingulate cortex and angular gyrus.”

We may find it difficult to know whether a voice is AI-generated or human, but our brain seems able to tell the difference. It responds with heightened alertness to AI voices and a sense of relatedness when listening to a human voice.

The participants rated human voices as more natural, trustworthy, and authentic, especially the happy and joyful ones.

Doctoral researcher Christine Skjegstad, who conducted the study together with Professor Sascha Frühholz, said, “We already know that AI-generated voices have become so advanced that they are nearly indistinguishable from real human voices.

“It is now possible to clone a person’s voice from just a few seconds of recording, and scammers have used this technology to mimic a loved one in distress and trick victims into transferring money.

“While machine learning experts have been developing technological solutions to detect AI voices, much less is known about the human brain’s response to these voices.”

This research indicates that our brain senses something isn’t quite right when it processes an artificial voice and becomes more cautious.

We may need a little more help than that as AI-generated voices become more ‘human’.
