Upgrades to Google’s search engine will make it better at understanding conversational queries – helping its mobile search apps tread on Siri’s toes.
By Tom Simonite on September 26, 2013
Google announced a series of upgrades to its search engine and mobile search apps today that strengthen its ability to understand queries in the form of natural sentences like those used in conversation. The changes are particularly focused on enabling more complex spoken interactions with Google’s mobile apps, boosting the company’s challenge to Apple’s Siri personal assistant.
“We are making your conversation with Google more natural,” said Amit Singhal, who leads search technology at Google. He spoke at a press conference held in the Menlo Park garage that served as Google cofounders Larry Page and Sergey Brin’s first office space in 1998.
The new features apply to all Google searches, but were all demonstrated with queries spoken out loud to Google’s mobile apps. One change makes Google better able to understand broad questions about categories of concepts. For example, saying “tell me about Impressionist artists” to the Google search app on a phone or tablet calls up a page that presents many ways to explore the topic. A carousel of images at the top of the page allows a person to swipe through different artists, and tapping one leads to another summary page with a carousel of works by that artist. Asking Google about a band brings up a list of its songs to listen to. Movies and many other topics can be explored in the same way.
Another upgrade gives Google the ability to compare different things or concepts. For example, asking the search app to “compare coconut oil versus olive oil” produces a table contrasting their nutritional qualities. Google selects the most relevant criteria to compare things. Asking for a comparison of two celestial bodies would see it use properties such as brightness, age, weight, and orbital period, for example.
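The comparison feature described above can be thought of as lining up the attributes two items share and presenting them side by side. The following toy Python sketch illustrates that idea; the item names, attribute labels, and nutrition figures are illustrative placeholders, not Google’s actual data or implementation.

```python
# Toy sketch of a "compare X versus Y" feature: find the attributes
# two items have in common and lay them out as table rows.
# All values below are illustrative placeholders.

foods = {
    "coconut oil": {"calories_per_tbsp": 117, "saturated_fat_g": 11.2},
    "olive oil": {"calories_per_tbsp": 119, "saturated_fat_g": 1.9},
}

def compare(a, b, data):
    """Return (attribute, value_of_a, value_of_b) rows for shared attributes."""
    shared = sorted(data[a].keys() & data[b].keys())
    return [(attr, data[a][attr], data[b][attr]) for attr in shared]

for attr, va, vb in compare("coconut oil", "olive oil", foods):
    print(f"{attr:20} {va:>8} {vb:>8}")
```

Selecting which attributes are “most relevant” is the hard part Google handles behind the scenes; this sketch simply uses every attribute both items share.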
Google’s new features rest on a system called Knowledge Graph, which the company unveiled last year. It gives Google’s software the ability to understand the meaning of, and relationships between, the people, places, and things mentioned in text (see “Google’s New Brain Could Have a Big Impact”).
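The core idea behind a knowledge graph is a set of entities connected by typed relationships. This toy Python example, which uses made-up entities and relation labels and is in no way Google’s actual system, shows how such a structure can answer a category query like “tell me about Impressionist artists” and then drill into one result.

```python
# Toy knowledge graph: each edge is a (subject, relation, object) triple.
# Entities and relations here are illustrative, not real Knowledge Graph data.

edges = [
    ("Claude Monet", "is_a", "Impressionist artist"),
    ("Edgar Degas", "is_a", "Impressionist artist"),
    ("Claude Monet", "created", "Water Lilies"),
    ("Edgar Degas", "created", "The Ballet Class"),
]

def members_of(category):
    """Entities linked to a category via the 'is_a' relation."""
    return [s for s, r, o in edges if r == "is_a" and o == category]

def works_by(artist):
    """Works an artist is linked to via the 'created' relation."""
    return [o for s, r, o in edges if s == artist and r == "created"]

# A category query first lists members, then a tap drills into one member.
print(members_of("Impressionist artist"))
print(works_by("Claude Monet"))
```

The carousel-of-artists, then carousel-of-works interaction described above maps naturally onto this kind of two-step graph traversal.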
Tamar Yehoshua, vice president for search at Google, also demonstrated an upgraded version of Google’s search app for Apple devices. “We have made voice a much bigger feature,” she said. The changes put the app into even more direct competition with Siri, which is promoted as a personal assistant people can talk to as they would a real person.
One new feature of the upgraded iOS app makes it possible to ask the app to remind you of something when you get to a specific location. If you tell it to “Remind me to get crackers when I go to Safeway,” the app will confirm which store you mean, and then notify you the next time you visit that location.
Singhal also announced that roughly a month ago his team completely overhauled Google’s core search ranking system to improve its handling of longer, more conversational queries. The upgraded system, known as Hummingbird, replaces one called Caffeine that had been in use since 2010. About 90 percent of Google searches have been affected by the change.
“People have started asking many more complex questions of Google, and our algorithm had to go through some fundamental rethinking,” said Singhal. The changes were focused on improving Google’s ability to understand the concepts a person refers to in a query and how they are related, he said. “You have to balance all that meaning of what the query is looking for with what the Web document is saying.”