At I/O 2018, Google announced a new “visual experience” for Assistant on phones that is quite reminiscent of the original vision for Google Now. Featuring proactive suggestions and personalized information for your day, this redesign is rolling out to phones this week.
Google describes this new capability of Assistant as a “visual snapshot” that serves information based on context like the time of day, location, and recent interactions. Kicking off this new look is a slight tweak to the Google Assistant panel that slides up when you press and hold on the home button.
The Assistant icon is smaller and located in the top-left corner while responses drop the speech bubble motif. Meanwhile, suggestion chips with various actions are now flat pills, much like we spotted on the Google app search bar yesterday.
To access the new Assistant experience, there is a new icon — the same as “Upcoming” — in the top-right next to Explore. Expanding to take up the entire screen, Google Assistant addresses you by name with those greetings changing depending on the time of day. Immediately underneath is the current weather condition and temperature.
This page updates and rearranges throughout the day, but cards you can expect to see include your Calendar appointments, reminders, travel times with the ability to immediately open Google Maps for navigation, and a carousel of various first- and third-party Assistant Actions like the weather, contacts, and listening to Spotify.
Other sections include upcoming trips, bills, orders from Gmail, reservations, and stocks. Towards the bottom is an “Explore using the Assistant” section and a list of Actions “You might like.”
Also available on iOS, this is the default view in the Google Assistant app. Google will be adding more sections like an overview of notes and lists from Keep and other third-party apps, parking reminders, personalized music and podcast recommendations, and nearby activities. It will begin rolling out this week to phones.
9to5Google’s Take
Since its debut on a smart speaker, Google Assistant has primarily been an audio-first experience. When it arrived on phones, it added a number of accommodations for a screen, but the primary method of interaction seemed to prioritize verbally asking questions and getting back audible replies. This began to change with Lens, and Google is now fully embracing a visual design.
Coinciding with the upcoming launch of Assistant smart displays, this new “visual snapshot” provides a more efficient way for Google to be “smart” and show users what it knows. It is more convenient in more settings than an audio rundown and thus more likely to be used.