When Google announced (and later began rolling out) conversational search back in May, it saw that as only the start. The company’s plans for the feature take us all the way into the realm of a true virtual personal assistant.
If you haven’t yet tried conversational search in Chrome, the feature as it stands is useful but basic. Speak a search like “How old is Barack Obama?” and Chrome will speak the answer. With a person, you could then ask a series of follow-up questions like “How tall is he?”, “Who is his wife?” and “How old is she?” and they would know who you were referring to in each question. That’s the functionality Google is rolling out, remembering who or what you just asked about and interpreting pronouns appropriately.
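The pronoun-tracking behaviour described above can be sketched in a few lines. This is a deliberately toy illustration, not Google’s implementation: the fact table, the pronoun list, and the keyword matching are all hypothetical stand-ins for the coreference and knowledge-graph machinery a real system would use.

```python
# Toy knowledge base — values are placeholders for illustration only.
FACTS = {
    "Barack Obama": {"spouse": "Michelle Obama", "height": "6 ft 2 in"},
    "Michelle Obama": {"spouse": "Barack Obama", "height": "5 ft 11 in"},
}

PRONOUNS = {"he", "she", "his", "her", "him"}


class Conversation:
    def __init__(self):
        self.focus = None  # the entity the dialogue is currently "about"

    def ask(self, question):
        words = set(question.lower().rstrip("?").split())
        # A pronoun resolves to the most recently mentioned entity;
        # otherwise look for an entity named in the question itself.
        if PRONOUNS & words:
            subject = self.focus
        else:
            subject = next(
                (name for name in FACTS if name.lower() in question.lower()),
                None,
            )
        if subject is None:
            return "I don't know who you mean."
        if {"wife", "spouse"} & words:
            answer = FACTS[subject]["spouse"]
            self.focus = answer  # the answer becomes the new focus
            return answer
        if {"tall", "height"} & words:
            self.focus = subject
            return FACTS[subject]["height"]
        self.focus = subject
        return f"I know about {subject}."


convo = Conversation()
convo.ask("How tall is Barack Obama?")  # → "6 ft 2 in"
convo.ask("Who is his wife?")           # → "Michelle Obama"
convo.ask("How tall is she?")           # → "5 ft 11 in" — "she" now means Michelle
```

The key point is the `focus` variable: each answer updates what the system believes the conversation is about, which is exactly what lets “How tall is she?” land on the right person.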
But Google’s long-term plans are far more ambitious. In an interview with TechFlash, Google Research Fellow Jeff Dean talked to Jon Xavier about his team’s work on machine learning and neural nets to expand Google’s abilities in conversational search …
Dean says his team is working on what he calls “big problems”: essentially, having a machine react as flexibly and intuitively as a person. He gives the example of the request “Book me a flight to DC.”
That’s a very high-level set of instructions. And if you’re a human, you’d ask me a bunch of follow-up questions, “What hotel do you want to stay at?” “Do you mind a layover?” – that sort of thing. I don’t think we have a good idea of how to break it down into a set of follow-up questions to make a manageable process for a computer to solve that problem.
Dean also sees a big future for integrating Google Now’s situational awareness with conversational search.
Like, if it’s trying to give me restaurant reviews, there’s probably 50 possible restaurants to choose. And they might all be pretty good suggestions because it knows what sorts of food I like, but it’s still a list of 50 restaurants. Again, this would be a place where a dialog would be useful. “Are you in the mood for Italian?” Something like that.
Google sees today’s searches as essentially still based on pretty crude keyword matching. The real future of intelligent search, the company believes, is in enabling machines to achieve a meaningful ‘understanding’ of questions. Once you have that, suggests Dean, you reach a whole new level of what a search engine can do.
Queries where you have to join together data from lots of different sources. Like “What’s the Google engineering office with the highest average temperature?” There’s no webpage that has that data on it. But if you know a page that has all the Google offices on it, and you know how to find historical temperature data, you can answer that question. But making the leap to being able to manipulate that data to answer the question depends fundamentally on actually understanding what the data is.
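Dean’s temperature example boils down to a join across two sources that each hold half the answer. Here is a minimal sketch of that step, with a made-up office list and invented temperature figures standing in for the real lookups; the hard part Dean is describing, of course, is knowing automatically that this is the join to perform.

```python
# Source 1: a (hypothetical) list of Google engineering offices.
offices = ["Mountain View", "Dublin", "Singapore", "Zurich"]

# Source 2: pretend historical average temperatures in °C (invented values).
avg_temp = {
    "Mountain View": 15.2,
    "Dublin": 9.8,
    "Singapore": 27.5,
    "Zurich": 9.3,
}

# The "join": neither dataset alone answers the question,
# but ranking one by the other does.
warmest = max(offices, key=lambda city: avg_temp[city])
print(warmest)  # → Singapore
```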
This is the point at which we reach the level of a genuine virtual PA: an assistant bright enough to understand the requirement, ask intelligent clarifying questions as required, and query multiple sources to obtain the information.
Couple that to interconnected personal devices and data, and you have what is to me the holy grail: the time when I can ask my phone to arrange lunch with you next week and it does it all intelligently.
It knows my diary. We’re friends, so my phone has permission to view your diary at a free/busy slot level. It knows where we each will be, to find mutually convenient locations. It knows what we each like to eat, and where we have eaten recently. It knows if a place we used to like has received a bunch of poor reviews lately. And it can connect to the restaurant booking system to grab the window table on the west side it knows I favour. Within a couple of minutes, it tells me it’s arranged lunch for us Tuesday at 1pm at the Rasa Sayang.
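The core of that scenario is the free/busy step: intersecting two calendars to find slots where both people are available. A minimal sketch follows, assuming times expressed as minutes since midnight, a 9:00–17:00 working day, and a 60-minute lunch window; the busy blocks are invented for illustration.

```python
def invert(busy, day_start, day_end):
    """Turn a list of busy (start, end) blocks into free blocks."""
    free, cursor = [], day_start
    for start, end in sorted(busy):
        if start > cursor:
            free.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < day_end:
        free.append((cursor, day_end))
    return free


def intersect(a, b):
    """Overlap two lists of (start, end) intervals."""
    out = []
    for s1, e1 in a:
        for s2, e2 in b:
            s, e = max(s1, s2), min(e1, e2)
            if s < e:
                out.append((s, e))
    return out


# Hypothetical busy blocks, as minutes since midnight.
mine = [(9 * 60, 12 * 60), (14 * 60, 17 * 60)]        # 09:00–12:00, 14:00–17:00
yours = [(9 * 60, 12 * 60 + 30), (15 * 60, 16 * 60)]  # 09:00–12:30, 15:00–16:00

day = (9 * 60, 17 * 60)
slots = intersect(invert(mine, *day), invert(yours, *day))
lunch = [(s, e) for s, e in slots if e - s >= 60]  # need at least an hour
# lunch → [(750, 840)], i.e. 12:30–14:00 works for both of us
```

Everything beyond this — restaurant preferences, recent reviews, booking the table — is the same pattern repeated: fetch from another source, filter, and combine.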
Mr Dean, I hope you’re working long hours.