Google AI's Surprising Move: Adding AI Language Skills to Robots

The technology giant Google has taken a new step toward the future of AI-powered robots, adding AI language skills to its everyday helper robots so they can better understand people.

Google’s research scientists covered this effort in the latest installment of Research Bytes, so let’s take a quick look at it below.

Google’s robots can now serve you chips on a natural-language command


Many companies now have robots that can perform simple tasks like fetching drinks and cleaning surfaces. Google’s parent company Alphabet is one of them, and it has been building such robots for years.

Until now, these bots could only respond to simple, explicit instructions. With the researchers’ AI language updates, they can work a little smarter than before.

The robots can now understand the implications of spoken sentences. For example, given a request like “I spilled my drink, can you help me?”, the robot goes to the kitchen to fetch a sponge.

Instead of simply apologizing, it gives a more useful response and works out the possible actions behind the command. This update may seem insignificant, but it is the start of something bigger.

For example, in the future we might see them take cues from indirect remarks like “Ohh! My Coke can got away,” and start working on the appropriate action.

Google’s research team calls this approach PaLM-SayCan, and they say the bots plan correct responses to 101 user instructions 84 percent of the time.

They execute those instructions successfully 74 percent of the time, and in Google’s robotics lab the research scientists are still working to improve the robots’ understanding and precision.
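At its core, the published SayCan idea combines two scores: a language model judges how relevant each robot skill is to the user’s request, and the robot’s own affordance model judges how feasible that skill is in the current scene. A minimal sketch of that decision rule, with made-up skill names and hand-picked scores (not Google’s actual code or values):

```python
# SayCan-style skill selection (illustrative sketch, not Google's implementation).
# Hypothetical scores for the request "I spilled my drink, can you help me?"

llm_relevance = {            # language model: p(skill helps with the request)
    "find a sponge": 0.50,
    "bring a coke can": 0.10,
    "wipe the table": 0.35,
    "go to the kitchen": 0.05,
}

affordance = {               # robot's value functions: p(skill can succeed right now)
    "find a sponge": 0.8,
    "bring a coke can": 0.9,
    "wipe the table": 0.2,   # low: the robot isn't holding a sponge yet
    "go to the kitchen": 0.9,
}

def pick_skill(relevance, feasibility):
    """Return the skill with the highest combined relevance * feasibility score."""
    return max(relevance, key=lambda skill: relevance[skill] * feasibility[skill])

print(pick_skill(llm_relevance, affordance))  # -> find a sponge
```

The combination is what makes the behavior look smart: “wipe the table” is highly relevant but infeasible without a sponge, so the robot fetches the sponge first.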

By the way, if you want to know more about it, you can check Google’s official PDF.
