
Why robots can be culturally insensitive – and how scientists are trying to fix the problem

A robot chats with an elderly British man in his bedroom. The robot has a cheerful demeanor and a pleasant, high-pitched voice.

The robot – perhaps because of the man's age – begins to ask him about his memories of the Second World War: "Please tell me, what was the most difficult thing you and your family had to go through?" The older man goes on to say that his father was in the Royal Air Force and that they didn't see him for almost four years.

But why was a robot bluntly asking him about one of the most traumatic experiences he had ever had? The robot's behavior was a product of the CARESSES project (Culture-aware robots and environmental sensor systems for elderly support).

This project fits into the new field of "cultural robotics," which aims to design robots that can take into account the cultural background of the person they are talking to and adapt their behavior accordingly. That is why the robot chatted about the war: because the man was British, it was assumed he would be interested.

In the future, we can expect robots to be used more and more often in our personal and social lives. There is currently active research in areas as diverse as delivery robots for supermarkets, entertainment robots, healthcare service robots, picking robots for warehouses, dementia support robots, robots for people on the autism spectrum and care robots for older people.

There is even a robot priest that can deliver blessings in five languages, and a robot monk that can teach people about Buddhism.

Cultural stereotypes

Cultural robotics is part of a broader movement aimed at making AI and robotics more culturally inclusive.

Concerns have already been raised about this movement. For example, large language models (LLMs), such as those used by OpenAI's ChatGPT, are trained on massive amounts of text. However, because the Internet is still predominantly English, LLMs are trained mainly on English-language texts – along with the cultural assumptions and biases contained therein.

Similarly, the push to make robots and AI more culturally sensitive is well-intentioned, but we are concerned about where it might lead.

For example, one study compared the cultural preferences of China, Germany and Korea in order to draw conclusions about what people in those countries would like their robots to look like.

Drawing on previous work on cultural preferences, the researchers suggested that more "masculine" societies tend to find "big and fast" things beautiful, while more "feminine" societies find "small and slow" things beautiful. They pointed to work purporting to show that Korean culture exhibits "medium masculinity" while German culture exhibits "high masculinity," and hypothesized that Koreans are more likely to find service robots (which tend to be small or medium-sized and slow) likeable.

Another study compared the personal space preferences of Germans and "Arabs." But these categories are not comparable. "Arab" is a term many people find offensive, and it can be used to describe people from many different cultural and national backgrounds. It is certainly not equivalent to a category like "German," a non-offensive term for people of a single nationality.

It is also becoming increasingly clear that people react differently to robots depending on their cultural background. For example, different cultures have different expectations around personal space, and this affects how far away robots should stand from them.

Different cultures also interpret facial expressions differently. One study found that people understand a robot better when it communicates using facial expressions they are familiar with.

Another way?

If we want to avoid designing robots based on broad, crude generalizations and stereotypes, we need a more nuanced approach to culture in robotics.

Culture is a notoriously fuzzy and nuanced term that is open to many interpretations. One survey lists over 300 possible definitions of culture.

In our recent research, we argued that culture is "conceptually fragmented." In short, we believe that there are so many different ways of understanding culture, and so many different types of robots, that we should not expect a one-size-fits-all approach.

We believe that different applications within robotics will require radically different approaches to culture. For example, imagine an entertainment robot in a theater tasked with dancing for the audience.

For this job, the best way to approach culture might be to focus on what kinds of entertainment people in the area prefer. This could involve asking which dance styles are popular locally and designing the robot accordingly.

Other applications may require a different approach to culture. For example, for a robot that is expected to interact with the same small number of people over an extended period of time (such as a service robot in a nursing home), it might be more important that the robot changes its behavior over time, adapting to the shifting preferences of the people it helps.

In this case, it might be better to think of culture as something that emerges slowly and dynamically through the interaction of different individuals.

This means that dealing with culture in robotics is likely to be complex, multi-layered and specific to each situation.

If we design robots based on relatively crude stereotypes and sweeping generalizations about different cultures, we risk perpetuating those stereotypes.
