Back at its October 4th hardware event, Google announced that Google Lens would be coming to the Assistant in “the coming weeks.”
Update – November 21st, 2017: Google has officially announced that Google Lens is rolling out to Pixel/Pixel XL and Pixel 2/Pixel 2 XL users in the US, UK, Canada, Australia, India, and Singapore over the next few weeks. This post has been updated to reflect that.
Original Story as follows:
Google Lens first showed up in Google Photos, where users can run existing photos through Lens and have it try to visually analyze whatever it finds in the picture.
With Google Lens in the Assistant, users just have to tap the new Lens button in the bottom right and the camera viewfinder will open. From there, you’ll be able to use Lens to visually search and analyze items in the real world.
Whether it’s pulling up reviews for a restaurant, identifying a type of flower, or entering a Wi-Fi password using the info printed on a router, you’ll be able to use Lens for all of that.
Google says that Google Lens in the Assistant can help you with the following:
- Text: Save information from business cards, follow URLs, call phone numbers, and navigate to addresses.
- Landmarks: Explore a new city like a pro with your Assistant to help you recognize landmarks and learn about their history.
- Art, books, and movies: Learn more about a movie, from the trailer to reviews, right from the poster. Look up a book to see the rating and a short synopsis. Become a museum guru by quickly looking up an artist’s info and more. You can even add events, like the movie release date or gallery opening, to your calendar right from Google Lens.
- Barcodes: Quickly look up products by barcode, or scan QR codes, all with your Assistant.