Google secretly rolls out conversational search about nearby places

Google can now answer “how tall is this” or “when does this restaurant open.” Is this natural-language, location-aware search feature a herald of ubiquitous computing?

David Singleton, director of Google's Android Wear, speaks during the Google I/O keynote presentation in San Francisco on May 28, 2015.

Jeff Chiu/AP Photo (FILE)

At Google’s annual I/O developers conference held May 28, the search engine giant unveiled new software features, as well as improvements to the Android operating system. Much of the event focused on new context-aware improvements to Google Now, Android’s digital personal assistant, but one of the software’s biggest features was actually rolled out secretly.

On Tuesday, a Google representative premiered Google Now’s location-aware search feature at the Search Marketing Expo in Paris. The tool, which has already rolled out to most Android devices, was captured on video by Search Engine Land’s Danny Sullivan and posted to Twitter.

The videos show users finding information without searching for specific terms. Instead of opening Google Maps or Search to establish a location, users can ask Google Now questions such as “how long is this river,” “how tall is this,” or “when does this restaurant open.” Google made a point of using its voice-recognition software in the demo.

These features are part of a growing push by tech companies not only to get people to speak naturally to their computers, but also to move toward “pervasive computing.” This type of context-aware computing combines multiple sources of information, such as location, past experiences, and who you’re with, to deliver relevant information to users.

Google is not alone in its search for easy computing. Apple, Microsoft, and academic centers have been doing research on the future of pervasive technology for years. At Apple’s Worldwide Developers Conference (WWDC) Monday, Apple unveiled a contextual intelligence feature for iOS 9 that assists the user with relevant information based on location, time of day, and activity. For example, if your iPhone senses that you might be going for a run, it will suggest your preferred playlist when you plug in your headphones.

Wired’s David Pierce notes that these various consumer features are all starting to blend together. But that’s not a bad thing.

“[There’s] real power in the notion that you know what your device can do, and how to use it, no matter what device it is,” writes Mr. Pierce. “The mobile industry briefly looked like a winner-take-all fight to the death, but it has instead become essentially a single shared laboratory of ideas.”

“The industry has collectively come to one idea about how we’ll use technology,” he notes.

Much of this can be traced back to computer scientist Mark Weiser’s concept of “ubiquitous computing.” Weiser, who served as chief scientist at Xerox’s Palo Alto Research Center, wrote in his 1991 Scientific American article, “The most profound technologies are those that disappear.” In Dr. Weiser’s view, technology should fade into the background: we shouldn’t notice it, yet it should be available everywhere we need it, creating calm rather than demanding attention. Computing should be pervasive throughout all that we do, and integrate seamlessly with how we do it.

Google Now, Apple’s Siri, and Microsoft’s Cortana all follow this playbook of seamless operation. As we move toward a fully functional Internet of Things, technology companies have aimed to make the “technology” itself disappear. Whether through natural-language processing or intelligent suggestion tools, the push to fully integrate these products into our lives has made them less and less noticeable.
