Google Developer Days took place in Europe this week, and although the event is now over, Google has uploaded a video from day 2 highlighting a number of new features coming to the Google Assistant. Some of these are already available while others aren’t, but everything here is quite impressive.

Starting off with one of the biggest additions to the Google Assistant, a new translator mode was shown off. Say “Ok Google, be my [Insert Language] translator,” and the Assistant will then repeat everything you say in the language you asked for. For example, after asking Google to be your Spanish translator, anything you say to it will be repeated audibly and visually in Spanish until you tell the Assistant to stop. Translation isn’t new to the Google Assistant, but this implementation is much more natural and should prove quite helpful when traveling abroad.

In addition to this, Google also showcased improvements to the Assistant’s ability to understand the context of the questions it’s asked. In one instance, Behshad (the Googler presenting these new features) asks the Assistant to show him pictures of Thomas. With no prior context for the question, the Assistant pulls up pictures of Thomas the Train.

Behshad then asks for Bayern Munich’s soccer team, and one of the players listed is Thomas Müller. Behshad asks once again for “pictures of Thomas,” and this time the Assistant shows pictures of Müller — drawing on the previous question and answer for context.

In another instance, we see a demo of something all of us have run across at one point or another — trying to remember the name of a movie. Here, Behshad asks the Assistant, “What is the name of the movie where Tom Cruise acts in it and he plays pool and while he plays pool he dances?” After thinking for a couple of seconds, Google Assistant pulls up a result for The Color of Money and then reads off information about it.

Along with these more substantial additions, Google Assistant can now answer questions faster, can recognize speech more accurately in noisy environments, and can better leverage Google Search when answering certain questions.

Following the improvements to the Google Assistant, a demo of Google Lens was also presented. This demo didn’t reveal anything new about Lens since Google last talked about it at I/O in June, but it’s still quite impressive. Here, we can see Google Lens look at an apple and know how many calories are in it, as well as look at three Polish Zloty bills and say how much they’re worth in Swiss Francs. That might not sound all that impressive in writing, so check out the video above to get a better idea of just what we’re talking about here — it’s seriously impressive stuff.

There’s no exact word as to when all of the upgrades to Google Assistant will be rolling out, but our guess is that they’ll go live either on or around the launch of the Pixel 2.

Check out 9to5Google on YouTube for more news:


About the Author

Joe Maring

Joe has been a writer and occasional video producer for 9to5Google since July 2017. Follow him on Twitter @JoeMaring1 and send all emails to