An AI's ability depends on the data it learned from and the power of the hardware its algorithms run on. The better the two, the better the AI should become.
Google Assistant is an AI-powered virtual assistant that is primarily available on mobile and smart home devices. Unlike the company's previous virtual assistant, Google Now, Google Assistant boasts the ability to engage with its users in two-way conversations.
And this time, Google is taking it up a notch.
To make this happen, Google wants Assistant to understand its users better.
Google knows that Assistant needs to not only recognize the words its users are saying, but also understand what they mean. What's more, Assistant should adapt to the way its users talk, rather than requiring them to speak the right words in the right order.
This is difficult, since teaching AI to understand spoken language means making computers grasp context. Spoken language varies so much from one person to the next that even pronouncing names can be difficult.
Acknowledging that names spelled the same way are sometimes pronounced differently, the company realizes that this complexity is the very thing that makes understanding the way people speak so difficult.
"Names matter, and it’s frustrating when you’re trying to send a text or make a call and Google Assistant mispronounces or simply doesn’t recognize a contact. We want Assistant to accurately recognize and pronounce people’s names as often as possible, especially those that are less common," said Google in a blog post.
So how can Google teach its AI to understand different pronunciations?
To solve this, Google is starting to allow users to teach Google Assistant to enunciate and recognize the names of their contacts the way they pronounce them.
Here, Assistant will listen to the pronunciation and remember it, without keeping a recording of the user's voice.
This means Assistant should be better at understanding its users when they say those names, and also be able to pronounce them correctly.
According to Google, the company is making this feature available in English first, before expanding it to more languages.
The next update is to create better conversations by making Assistant better at understanding context.
"Assistant’s timers are a popular tool, and plenty of us set more than one of them at the same time," said Google.
"Maybe you’ve got a 10-minute timer for dinner going at the same time as another to remind the kids to start their homework in 20 minutes. You might fumble and stop mid sentence to correct how long the timer should be set for, or maybe you don’t use the exact same phrase to cancel it as you did to create it."
"Like in any conversation, context matters and Assistant needs to be flexible enough to understand what you're referring to when you ask for help."
In this case, Google said that it rebuilt Assistant's natural-language understanding (NLU) models so it can more accurately understand context while also improving its "reference resolution."
This update has been made available for alarms and timers on Google Assistant, initially on smart speakers in English in the U.S.
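To illustrate the idea behind reference resolution, here is a minimal toy sketch, not Google's actual NLU models: with several timers active, a command is matched to the timer whose label it most plausibly refers to, even when the wording differs from the phrase used to create it. The timer labels and matching rule are assumptions for illustration only.

```python
# Toy sketch of reference resolution among multiple active timers.
# (Hypothetical example; Google's real NLU models are far more sophisticated.)

# Two timers a user might have running, labeled at creation time.
active_timers = {"dinner timer": 600, "homework reminder": 1200}

def resolve_timer(utterance: str) -> str:
    """Return the label of the active timer the utterance most likely refers to.

    Here, the "resolution" is simply the count of shared words between the
    utterance and each timer's label.
    """
    words = set(utterance.lower().split())

    def overlap(label: str) -> int:
        return len(words & set(label.lower().split()))

    return max(active_timers, key=overlap)

# The user never repeats the exact creation phrase, yet the right
# timer is still found.
print(resolve_timer("cancel the dinner one"))    # -> "dinner timer"
print(resolve_timer("stop the homework one"))    # -> "homework reminder"
```

Even this crude word-overlap heuristic shows why flexibility matters: "cancel the dinner one" is not the phrase that created the timer, but it still points unambiguously at one of the two.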
Most of the updates here are powered by the company’s BERT language model.
This technology helps bring more natural, back-and-forth conversations with Google.
BERT, which stands for "Bidirectional Encoder Representations from Transformers," is a Transformer-based machine learning technique for natural language processing (NLP) pre-training.
When it was first created and published in 2018 by Jacob Devlin and his colleagues from Google, it achieved state-of-the-art performance on a number of natural language understanding tasks.
This is because BERT is a bidirectional, unsupervised language representation that is pre-trained using only a plain-text corpus.
Most other NLP approaches use context-free models, which generate a single word embedding for each word in the vocabulary. BERT, on the other hand, takes into account the context of each occurrence of a given word.
This vastly improves its ability to understand language.
Previously, Google has been using BERT to make its search engine understand queries beyond keywords, among other things. In 2020, Google said that BERT was already being used on almost every query in English.