Beijing Smart Jia explains the future of robot communication

From Apple's Siri to Honda's Asimo, robots seem to be getting better and better at communicating with humans. Yet some neuroscientists warn that today's computers will never truly understand what we are saying, because they do not take into account the deeper context that gives human communication its meaning.

Specifically, Arjen Stolk, a postdoctoral fellow at the University of California, Berkeley, and his Dutch colleagues argue that robots cannot develop the kind of shared understanding of place and situation that humans have, an understanding that usually rests on a long social history and is the key to interpersonal communication. Without such common ground, computers are bound to be confused.

“People tend to think that they communicate through words or gestures, but forget that much of communication depends on the social context and on who you are communicating with,” says Arjen Stolk.

"All these nuances are the key to mutual understanding," Arjen Stolk said. Perhaps computers and many neuroscientists use text and signals as the focus of communication. “In fact, we can understand each other, do not need language, do not need words, do not need signals, and already have a common understanding.”

Consider babies and parents, not to mention strangers: they have no common language, yet they can communicate effectively within a short time, relying not only on gestures but, more importantly, on a shared background.

As two people build more and more on concepts they have already shared, the same area of their brains, the right temporal lobe, becomes more active (blue marks communication activity, orange marks interpretation activity). This suggests that this brain region is key to mutual understanding, as people continually update their common ground over the course of a conversation.

Arjen Stolk believes that scientists and engineers should pay more attention to the contextual side of mutual understanding. His brain-scanning experiments show that such non-verbal mutual understanding relies on distinct computational and neural mechanisms, and some of his studies indicate that disorders such as autism impair this kind of mutual understanding.

"In understanding the humanity's need for any change in language communication, it provides a new theoretical and empirical basis for understanding normal social communication, and provides a new perspective for understanding and treating neurological communication barriers and neurodevelopmental disorders. Window," said Dr. Robert Knight, a professor at the University of California, Berkeley, Institute of Neuroscience.

To explore how brains achieve this mutual understanding, Arjen Stolk created a game that requires two players to convey its rules to each other without speaking or even seeing one another, eliminating the influence of language and gestures. He then placed both players in MRI scanners and recorded their brain activity while they communicated non-verbally through a computer.

Because the players could not talk or see each other while conveying the rules, the setup helped the neuroscientists isolate the parts of the brain responsible for mutual understanding.

He found that the same area of the brain, located in the poorly understood right temporal lobe just above the ear, became active in both players as they tried to convey the rules of the game. Importantly, activity in this right temporal region stayed largely stable throughout the game, but increased when a player suddenly understood what the other was trying to communicate. The right hemisphere is more involved in abstract thought and social interaction than the left.

“The activity in these right temporal areas increases not with the communication signals themselves, but when the two sides reach a shared understanding of something,” Arjen Stolk said. “The better the players understood each other, the more active the region became.”

This suggests that both players build a similar conceptual framework in the same brain area, constantly testing one another to make sure their concepts stay aligned, and adjusting and updating them only when new information changes the mutual understanding.

“This is amazing,” Arjen Stolk said. “One communicator is planning actions while looking at static visual input, and the other is watching dynamic visual input, yet in both of them the same brain area becomes more active as their mutual understanding improves.”

Robots and computers, by contrast, interpret a word by statistical analysis of how it is usually used. If the "bank" you usually talk about is a place to cash a check, then that is the meaning the computer will assume in the conversation, even when the conversation is about fishing and a riverbank is what is meant.

A computer will struggle with such a conversation, but a human understands it immediately, because human communicators share a conceptual space, a common ground, that lets them quickly interpret the situation. Words and symbols are merely a means of providing evidence about that shared understanding.
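As a rough illustration of this contrast, the sketch below picks the sense of "bank" in two ways: by overall usage frequency, as a purely statistical system might, and by overlap with the words of the current conversation, a crude stand-in for shared context. The sense labels, counts, and cue words are invented for the example and are not taken from the research.

```python
# Hypothetical usage counts for the word "bank" across a user's past conversations.
SENSE_FREQUENCY = {"financial institution": 950, "riverbank": 50}

# Hypothetical cue words linking each sense to a conversational context.
SENSE_CUES = {
    "financial institution": {"check", "cash", "loan", "deposit"},
    "riverbank": {"fishing", "river", "rod", "trout"},
}

def sense_by_frequency(sense_counts):
    """Pick the sense that is statistically most common overall."""
    return max(sense_counts, key=sense_counts.get)

def sense_by_shared_context(conversation_words, cues):
    """Pick the sense whose cue words overlap most with the current conversation."""
    scores = {sense: len(cue_words & conversation_words)
              for sense, cue_words in cues.items()}
    return max(scores, key=scores.get)

conversation = {"we", "went", "fishing", "by", "the", "bank"}

print(sense_by_frequency(SENSE_FREQUENCY))                # financial institution (wrong here)
print(sense_by_shared_context(conversation, SENSE_CUES))  # riverbank
```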

"The focus of Apple Siri is statistical law, but communication is not a statistical law," he said. “Statistical rules may make you go further and further, and the brain doesn't. In order for computers to communicate with us, they need a cognitive architecture that continuously captures and updates the conceptual space shared with their communication partners during the call.”

As you can imagine, such a dynamic conceptual framework would allow computers to make sense of the ambiguous conversations that real people actually have.

Arjen Stolk's research has identified other brain areas important for mutual understanding. In a 2014 study, he used brain stimulation and found that the back of the temporal lobe is important for integrating incoming signals with knowledge from previous interactions. A later study found that patients with damage to the temporal lobe make decisions about how to communicate without fine-tuning them to their stored knowledge about the addressee. Together, these studies help explain why such patients struggle in everyday social interactions.

"Most cognitive neuroscientists focus on the signals themselves, the language, the gestures, and their statistical relationships, ignoring the underlying conceptual abilities that are often used in our daily communication," he said. "The language is very Help, but it is a communication tool, it is not communication itself. By focusing on language, you may focus on tools rather than underlying mechanisms, and our brain's cognitive structure helps us communicate."
