
Can machines be “truer” than humans?


It is claimed that truth is singular and lies are plural, giving disinformation an unfair numerical advantage. But is truth really singular?

Take the stories of our own lives. Which is the truest version? The official one on our resume or LinkedIn profile? The one we tell ourselves? Or the one our family and friends tell about us behind our backs? They can all be true – or misleading – at the same time.

The idea that multiple truths can be drawn from the same material is brilliantly explored in a film I saw last week: a documentary based on the life of the multi-talented music producer Brian Eno that is generated by a machine and varies with each screening.

According to the filmmakers, there are 52 trillion possible versions of it, which would make for "a really big box". This artistic experiment tells us a great deal about the nature of creativity and the plurality of truth in the age of generative media.

For the film, producer Gary Hustwit and creative technologist Brendan Dawes digitized more than 500 hours of Eno's video footage, interviews and recordings. From this archive, which spans 50 years of Eno's creative output and his collaborations with artists such as Talking Heads, David Bowie and U2, two editors created 100 scenes. The filmmakers wrote software that generated opening and closing scenes featuring Eno and laid out a rough three-act structure. They then let the software loose on the digital archive, stitching together different scenes and recordings to create a 90-minute film.

Critics generally found the film – or films – strange and compelling, as did Eno himself. It can seem a little random, Dawes tells me, but audiences were still able to absorb the ingredients and construct a narrative in their heads. "The audience cooks," he adds.

The version I saw was a captivating mixture of interviews and images, with some jagged juxtapositions but a clear narrative arc. I was particularly fascinated by a section in which Eno talked about the concept of "scenius". Eno has long resisted the idea that creativity is the work of a single genius, arguing instead that it is the product of a collective, societal intelligence, or scenius. "The film is the embodiment of this idea of scenius," says Dawes.

Hustwit and Dawes have now started a company called Anamorph to apply their generative software to other types of content. Target customers include Hollywood studios, advertising agencies and sports franchises. However, Dawes stresses that they use their own proprietary software to reinterpret existing human-created content in novel ways; they do not use generative AI models such as OpenAI's GPT-4 to generate new content.

However, the increasing prevalence of generative AI models raises further questions about veracity. Much has been written about how flawed models can generate falsehoods and "hallucinate" facts. That is a serious drawback if a user wants to draft a legal opinion. But it can actually be a feature when creating fictional content.

To investigate how good it is at this, Nina Beguš, a researcher at the University of California, Berkeley, commissioned 250 human writers in 2019, and 80 generative AI models last year, to write short stories based on the same specifications. The challenge was to reinterpret the Pygmalion myth, in which a human creates an artificial human and falls in love with their creation.

Beguš tells me that she was surprised that the machine-generated content, while more formulaic and less imaginative, was nonetheless more buoyant. Rather than reinforcing societal stereotypes, the models appeared to challenge them. In their stories, for instance, more of the creators were women and more of the relationships were same-sex.

She suspects these results reflect the way human programmers have refined the models, although it is hard to tell given the models' opacity. But she says we have now reached a "new frontier" of writing, where human and non-human generated content have effectively merged.

This raises concerns about the extent to which dominant U.S. AI companies embody the values of this new frontier, values that could clash with those of other societies. "Will these hegemonic models and cultures prevail everywhere?" she asks.

Whether machines improve or degrade truthfulness, and which human values they reflect, depends on how they are designed, trained and deployed. We had better pay close attention.
