
Review: Subtlety makes and breaks the Pixel 4’s new Google Assistant

Google Assistant is what unifies the company’s first-party hardware across different form factors. Back in May, the company teased a “next-generation Assistant” that’s faster and more capable. In actual usage, the most important changes in the Pixel 4’s new Google Assistant are a subtle new design that borders on invisible and seamless “Continued Conversation,” while the much-touted phone control fails to live up to its promise.

Assistant UI evolves and shrinks for the better

Regardless of how you open the Assistant, you know it’s new thanks to the absolutely delightful four Google colors that stream in from the sides of the phone before pooling together at the bottom. The resulting light bar is inherently Googley, with a glow emanating from the strip as Assistant asks, “Hi, how can I help?” Beyond looking futuristic, this subtle “UI” at the bottom of the screen signals when Assistant is ready to take a question, and its colors shift as you’re speaking.

Of all the launch methods available now (there are many), I prefer the single button press in the Pixel Launcher’s search bar. It requires less effort than Android 10’s gesture navigation system, while Active Edge does not work for me because I’m a chronic tight squeezer of my phone. (As a complete aside, I don’t often pocket my devices. When I’m out in the world or even around the house, I much prefer to be holding my phone so that I can always see news on the Ambient Display. As I’m also case-less, I derive my sense of phone security from squeezing it tightly whenever it’s not being used and just at my side.)

Instead of immediately opening a sheet that covers the bottom third of your screen, the Pixel 4’s new Assistant is more subtle. A transparent overlay that’s about 60% shorter transcribes your command, with shortcuts for the Assistant Updates feed and text input. Unfortunately, this design removes the ability to quickly launch Lens, but it’s altogether a much-improved aesthetic and user experience.

Similarly, responses do not take up the entire display. Answers to simple questions just take up the bottom third of the screen, while launching music is a half panel. The largest sheet I’ve regularly encountered — for the weather — takes up two-thirds of the screen.

This conciseness results in your current app or homescreen remaining visible in the background. You can tap anywhere above or outside Assistant to return to what you were doing before seeking help. With the old design, exiting Assistant was like closing any other app or required a swipe up on the gesture pill to go back to the homescreen.

This much-improved preservation of context and subtlety reveals how Google wants you to interact with Assistant. In the past, voice was an experience that took up the entire display and often brought you to a scrollable feed. The new Assistant on the Pixel 4 is more akin to Google’s proclaimed ambient computing future, where asking for help should happen without any forethought of it disrupting your life.

That said, the new Assistant could have been even more “out of the way” and UI-less. When the next-generation Assistant was first demoed at I/O, Google showed it off with 2-button navigation. The beauty of that now-deprecated navigation system was how your speech transcript could be displayed to the right of the pill. Assistant didn’t need a new transparent overlay; it could just take advantage of the navigation bar.

Faster Assistant, but not more Assistant

Another step forward is the Pixel 4’s big technological achievement: shrinking the 100 GB of models responsible for speech recognition and language understanding down to just 0.5 GB. Running on-device, voice processing works offline and allows for local actions like turning on the flashlight or starting a timer without the need for the cloud.
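Google hasn’t detailed how the new Assistant wires up its on-device models, but Android’s public SpeechRecognizer API gives a rough feel for what offline voice recognition looks like at the platform level. The sketch below is purely illustrative and assumes an offline recognition model is installed on the device; the RecognizerIntent extras are standard Android, while the activity itself and the comments about local actions are hypothetical, not a description of how Google built the new Assistant.

```kotlin
import android.app.Activity
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Hypothetical activity showing offline-preferred speech recognition
// with the standard Android SpeechRecognizer API.
class OfflineVoiceActivity : Activity() {

    private lateinit var recognizer: SpeechRecognizer

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Standard platform recognizer; on recent Pixels this can run fully on-device.
        recognizer = SpeechRecognizer.createSpeechRecognizer(this)
        recognizer.setRecognitionListener(object : RecognitionListener {
            override fun onResults(results: Bundle) {
                val text = results
                    .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull()
                // An assistant-style app would now map `text` to a local action
                // (flashlight, timer) without a network round trip.
            }
            // Remaining callbacks left empty for brevity.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onPartialResults(partialResults: Bundle?) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })

        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
            // Ask the recognizer to stay on-device when an offline model is available.
            putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
        }
        recognizer.startListening(intent) // Requires the RECORD_AUDIO permission.
    }

    override fun onDestroy() {
        recognizer.destroy()
        super.onDestroy()
    }
}
```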

Google at I/O touted a 10x speed increase in getting answers, but in usage I haven’t noticed a real drop in how long it takes to get most things done with Assistant. Then again, I’ve never had an issue with Assistant’s speed in day-to-day life.

Queries that still need to hear back from Google — which make up the bulk of requests — ultimately take about the same time to complete, and that comes down to your internet connection. Simple tasks like opening apps or taking a picture are speedy, but it’s unclear if I prefer this to just tapping away on the screen.

This speed does make it possible to fire off back-to-back commands, which Google expects people will do in the future for complex tasks; such experiences would otherwise be glacially slow and impractical. While I’m willing to issue additional follow-ups, the added speed does not make me invoke Google more times throughout the day.

The bigger usability change with the new Assistant is Continued Conversation. Similar to how the visual interface is now out of the way, not having to preface every command with “Hey Google” makes voice interaction so much more natural. Google will keep the microphone on for a few seconds in case you have anything else to ask. You quickly learn to exclude the hotword and it’s mighty futuristic. Oddly, I noticed that you have to manually turn on Continued Conversation for the Pixel 4 in Assistant settings.

The new Assistant tricks are limited in scope

Being able to control the Pixel 4 with your voice is perhaps the biggest feature addition that Google is advertising for the new Assistant, and there are a couple of examples that Google is touting specifically. The first involves asking for a set of pictures with just your voice, tapping to select one, and then having Assistant send it directly to a contact. This specific interaction — which to the company’s credit involves many steps — works very well.

That said, this mix of voice and touch interaction seems to be an inefficient way to send a picture, link, or video once you’re well-versed with Android share sheets, Google apps, and Messages. It’s definitely useful for owners new to Android and Google services, and it harkens back to the popularity of just demanding tasks from Smart Displays and speakers, but those users will eventually pick up how to navigate quickly on their own.

Beyond directly controlling specific apps like Photos, Google touts the new Assistant as aiding general multitasking. For example, if you’re texted about your travel details, you can ask, “When does my flight land?” and Assistant already knows the answer because of what’s in your Gmail inbox. The new Assistant displays a panel with the information, and you can say things like “Reply I’m taking off at 10:30,” and Assistant will know to insert that text into the reply box in Messages.

These are nice and all, but after what Google showed on stage earlier this year, I expected today’s new Assistant to be capable of a greater number of more complex tasks. One of the first things I tried was navigating the web with my voice, which is only a partial experience for now.

On the Pixel 4, “Visit 9to5google.com” opens the site in Chrome, while the Pixel 3 launched a Chrome Custom Tab originating from the Google app, removed from my other open tabs. This is an improvement, but I could not select articles to open or scroll down by voice — though Assistant did advise me to download the Voice Access accessibility app.

As seen with Google Photos and Chrome (to a limited degree), some apps are more closely integrated with the new Assistant than others. For example, when you ask for locations in Maps or videos in YouTube, commands will be interpreted as if you typed directly into the in-app search fields.
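Google doesn’t say how its own apps hook into the new Assistant, but the public App Actions mechanism suggests the general shape: Assistant fulfills a spoken request by deep-linking into the app, which then runs its normal in-app search. Below is a minimal, hypothetical sketch of the receiving side only, assuming a made-up example://search?q= deep link and a placeholder runInAppSearch() helper; none of this reflects what Maps or YouTube actually do internally.

```kotlin
import android.app.Activity
import android.content.Intent
import android.os.Bundle

// Hypothetical activity registered in the manifest for the deep link
// example://search?q={query}, which an Assistant App Action could fulfill.
class AssistantSearchActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Assistant's fulfillment arrives as an ordinary ACTION_VIEW deep link.
        if (intent.action == Intent.ACTION_VIEW) {
            val query = intent.data?.getQueryParameter("q") ?: return
            // Feed the spoken query into the app's existing search path,
            // as if the user had typed it into the in-app search field.
            runInAppSearch(query)
        }
    }

    private fun runInAppSearch(query: String) {
        // Placeholder: a real app would query its backend or local index here.
    }
}
```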


Meanwhile, sending a simple email is possible through the Assistant UI today, but the ability to compose directly in Gmail — as shown at I/O — is not yet available. Speaking of email, there is still a bug where the new experience is unavailable if you have a G Suite account on your device.

The new Assistant is evolutionary, not revolutionary

Many Google products — and visions about AI — start out as promises. The new Assistant fully accomplishes the specific use cases that Google is advertising to buyers. That said, the real promise is how many more complex, but everyday phone-controlling workflows can be handled by voice in the future.

With that control off in the distance, albeit closer than ever before, the real thing that Google brings to the table with the new Assistant is an unobtrusive design that is subtle today and signals an invisible future. Until voice becomes a more powerful control, the new Google Assistant should not be the sole reason you buy the Pixel 4 or Pixel 4 XL.


