One particular advancement driven by machine learning is the ability for computers to understand natural language, with Google showcasing these improvements in Smart Reply. Its Research division has been exploring other applications and is today releasing two fun and interesting demos.
Last year, Google was able to increase the percentage of Smart Reply’s usable suggestions by using hierarchical vector models of language:
Natural language understanding has evolved substantially in the past few years, in part due to the development of word vectors that enable algorithms to learn about the relationships between words, based on examples of actual language usage. These vector models map semantically similar phrases to nearby points based on equivalence, similarity or relatedness of ideas and language.
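To get a concrete sense of what “mapping semantically similar phrases to nearby points” means, here is a minimal sketch using invented toy vectors and cosine similarity; real vector models learn far higher-dimensional representations from actual language usage, and the phrases and numbers below are purely illustrative.

```python
import numpy as np

# Toy embeddings: a real vector model maps each phrase to a high-dimensional
# point learned from examples of language usage. These 3-D vectors are
# invented purely for illustration.
embeddings = {
    "how do I learn piano": np.array([0.9, 0.1, 0.2]),
    "tips for practicing music": np.array([0.8, 0.2, 0.3]),
    "best hiking trails": np.array([0.1, 0.9, 0.4]),
}

def cosine_similarity(a, b):
    """Similarity of two vectors; values near 1.0 mean nearby/related."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embeddings["how do I learn piano"]
for phrase, vec in embeddings.items():
    print(f"{phrase!r}: {cosine_similarity(query, vec):.2f}")
# The semantically related phrase scores close to 1.0; the unrelated one lower.
```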
These improvements can drive new search experiences, as Google is demoing with “Talk to Books.” Users can ask a question or make a statement, and the website will try to respond by suggesting a relevant passage from a book. Google’s idea is that these responses could “help you determine if you’re interested in reading them or not.”
Once you ask your question (or make a statement), the tool searches all the sentences in over 100,000 books to find the ones that respond to your input based on semantic meaning at the sentence level; there are no predefined rules bounding the relationship between what you put in and the results you get.
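As a rough sketch of how that kind of sentence-level retrieval can work, the snippet below embeds every corpus sentence once, embeds the query, and ranks sentences by vector similarity with no hand-written rules. The embed function here is only a crude word-hashing stand-in for a trained sentence encoder (it rewards shared words rather than true semantic similarity), and the “book” sentences are made up.

```python
import numpy as np

def embed(text):
    """Crude stand-in for a learned sentence encoder: hashes words into a
    fixed-size vector, so it only captures word overlap, not true semantics.
    A system like Talk to Books would use a trained neural model here."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Embed every sentence in the corpus ahead of time (made-up examples standing
# in for the sentences drawn from over 100,000 books).
corpus = [
    "The piano requires daily practice to master.",
    "She hiked the trail before sunrise.",
    "Learning an instrument changes how you listen to music.",
]
corpus_vectors = np.stack([embed(s) for s in corpus])

def search(query, top_k=2):
    """Rank corpus sentences by similarity to the query; no predefined rules
    bound the relationship between the input and the results."""
    scores = corpus_vectors @ embed(query)
    best = np.argsort(scores)[::-1][:top_k]
    return [(corpus[i], round(float(scores[i]), 2)) for i in best]

print(search("How should I practice the piano every day?"))
```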
Compared to Smart Reply’s paragraph-level analysis, this experiment looks only at individual sentences from the 100,000 books. As such, responses might be taken out of context, but Google argues that the upside is finding “unexpected authors and titles.”
Meanwhile, Semantris is a fun word association game that uses a Tetris-like format to demo the technology. The player enters a word or phrase, which the game ranks against the words on screen, scoring each by how well it responds to the input and then eliminating the top match as part of the game.
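A sketch of that ranking-and-elimination loop might look like the following; the on-screen words and association scores are invented, whereas in Semantris the scores would come from Google’s learned semantic model.

```python
# Sketch of the ranking-and-elimination loop described above. The association
# scores are invented; Semantris gets them from a learned semantic model.
def association_score(clue, word):
    """Stand-in for a model's clue-to-word relatedness score."""
    toy_scores = {
        ("sand", "beach"): 0.92, ("sand", "keyboard"): 0.11,
        ("sand", "winter"): 0.18, ("sand", "coffee"): 0.07,
    }
    return toy_scores.get((clue, word), 0.0)

def play_turn(clue, on_screen):
    """Rank the on-screen words by how well they respond to the player's
    clue, then eliminate the top-ranked word."""
    ranked = sorted(on_screen, key=lambda w: association_score(clue, w), reverse=True)
    best = ranked[0]
    on_screen.remove(best)
    return best, ranked

words = ["beach", "keyboard", "winter", "coffee"]
cleared, ranking = play_turn("sand", words)
print(cleared)   # 'beach' is cleared; the other words remain in play
print(words)     # ['keyboard', 'winter', 'coffee']
```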
The “Blocks” mode features a Tetris format that I’ve found quite fun, especially when you try to be deliberately vague and see whether the natural language understanding can still make even the most tenuous of connections.
Other practical applications include classification, semantic similarity, semantic clustering, whitelist applications (selecting the right response from many alternatives), and semantic search (Talk to Books). Google hopes that by sharing these demos, others will come up with more novel uses of these latest machine learning advancements.