Published on November 11, 2020. Updated on March 4, 2022.

The AI that couldn't talk

Is language the prerogative of Man?

Is language the prerogative of human beings? Are we the only ones who can not only speak, but also converse, and establish an exchange with another entity?

No, the other entity we are referring to is not from the animal kingdom, but from artificial intelligence. Today, we will be talking about robots.

Can robots, these AIs, however technologically advanced, nevertheless establish a true dialogue with us? This is what we will try to find out. From human to robot, does "virtual" language exist, or is it just a chimera drowned in a sea of data?

Virtual assistants and us

Alexa, Google, and Siri are now often part of our daily lives. These virtual assistants are integrated into our smartphones, computers, and even our homes with their hardware variations.

We call on them for many things: reminding us of an event or an appointment, starting a timer, taking a note, answering questions about a definition, a translation, a currency or length conversion (thank you, Alexa, for flying to my rescue when converting feet into meters!), singing us a song, wishing us a happy birthday, saying good morning and encouraging sweet dreams before we go to sleep, giving us the news, the weather, a store's opening hours, and so on. In short, a tool that quickly becomes indispensable to anyone who uses it well.

Yes, we talk to them, and they answer us, but is it a conversation? No. It is merely the understanding of a question, an interpretation, and an appropriate response. The "conversation" ends there. Don't expect your virtual friend, after answering you, to carry on the conversation on the same topic!
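This one-shot "understand, interpret, respond" loop can be sketched in a few lines. The intents, phrases, and answers below are purely hypothetical illustrations, not the actual logic of Alexa, Google Assistant, or Siri:

```python
# Minimal sketch of a virtual assistant's one-shot loop:
# match the question to an intent, produce an answer, and stop.
# All intents and phrasings here are invented for illustration.
import re

INTENTS = {
    "timer": (re.compile(r"timer for (\d+) minutes"),
              lambda m: f"Timer set for {m.group(1)} minutes."),
    "convert_feet": (re.compile(r"(\d+(?:\.\d+)?) feet in meters"),
                     lambda m: f"{float(m.group(1)) * 0.3048:.2f} meters."),
}

def respond(utterance: str) -> str:
    """Match the utterance to one intent and return a single answer."""
    for pattern, answer in INTENTS.values():
        match = pattern.search(utterance.lower())
        if match:
            return answer(match)
    # No memory, no follow-up: the "conversation" ends here.
    return "Sorry, I don't understand."

print(respond("Set a timer for 10 minutes"))  # Timer set for 10 minutes.
print(respond("What is 6 feet in meters?"))   # 1.83 meters.
```

Note what is missing: nothing ties one call to the next, which is exactly why the exchange never becomes a conversation.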

AI vs. human

But why can't AIs simply converse with us? Let's take a simple example: the verb "to give". Mime the gesture to a person and they will understand, regardless of their language or culture, that holding something out to someone means it is being offered, given to them.

The notion of "giving" is then intuitively understood and quickly internalized. What about a robot? It cannot reproduce the same pattern! It would have to ingest, through a learning algorithm, thousands of sentences covering all the possible combinations of verbs and situations in context. For an act as basic and banal as "giving", more than a thousand situations are possible.

According to Frédéric Landragin, an expert in computational linguistics at the CNRS and author of the book "How a Robot Speaks", arriving at a first correct model of language could take several years, because it is not just a matter of having an AI process a concept, but of understanding it as a whole.

The Blender Bot project

However, all of this is gradually changing. And these changes we owe to... Facebook! In Paris, at Facebook AI Research (FAIR for short), a promising research project is under development. It is called Blender Bot and claims to be able to converse like a human on any subject. Fact or fiction?

The sea of data in which this prototype swims is almost beyond belief: a neural network of 9.4 billion parameters, a corpus of 1.5 billion conversations, and above all... a memory that retains previous exchanges between the (human) interlocutor and the AI! It could, for example, ask you how yesterday's appointment went, the one you had asked it to add as a reminder to your calendar! It is almost paranormal, or an invasion of privacy! But how is this possible? Have we finally found "the missing link", the one that makes language "intuitive" for an AI?

And tomorrow?

As Frédéric Landragin again points out, "unlike humans, the machine is not endowed with common sense. As a result, it struggles to grasp the implicit elements of language".

He gives a concrete example with the sentence "the cup does not fit on the shelf because it is too small." For us, there is no problem understanding this sentence, but for an AI it raises a big problem: what does "it" refer to, the cup or the shelf? (In the original French, both nouns are feminine, so the pronoun can refer to either.)

To resolve this, the AI will have to analyze several referents in its database: models of cups, models of shelves, a comparison of their sizes and volumes, and that's not all! The question of gravity, in this case "what sits on what?", also comes into play. After these analyses, the AI will conclude that it is indeed the shelf being referred to: on the one hand, it is too small to hold the cup; on the other, it is the one acting as a support for the cup.
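The size-and-support reasoning described above can be sketched as a toy rule. The "supports" relation below is hand-written world knowledge, invented for this one example; a real system would have to learn such facts from data, which is precisely the hard part:

```python
# Toy sketch of resolving "it" in "the cup does not fit on the shelf
# because it is too small". The SUPPORTS relation is invented world
# knowledge for illustration, not a real coreference system.
SUPPORTS = {("shelf", "cup")}  # (support, supported object)

def resolve_pronoun(candidates, predicate):
    """Pick the referent of 'it' among the candidate nouns."""
    if predicate == "too small":
        # The thing that is "too small" to hold the other object
        # must be the one acting as the support.
        for support, supported in SUPPORTS:
            if support in candidates and supported in candidates:
                return support
    return None

print(resolve_pronoun(["cup", "shelf"], "too small"))  # shelf
```

The rule works for this sentence only because the world knowledge was supplied by hand, which illustrates Landragin's point: without common sense, the machine has nothing to anchor the pronoun to.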

Finally, we can say that even as AI continues to develop and improve, it remains limited to what its creators choose to instill in it. Today, we can consider that language remains specific to humans. However, nothing says that robots will never one day think for themselves; but there, we are already in the realm of science fiction...

Sources and illustrations

But why do robots still not know how to speak?, Antoine Crochet-Damais, Journal du Net, October 2020

How a Robot Speaks, Frédéric Landragin, Belial, June 2020

Alexa, photo by Anete Lusina from Pexels
Robot, photo by Alex Knight from Pexels
Robot 2, photo by Suzy Hazelwood from Pexels
