Google Lens, the smart feature that analyzes and identifies what your camera sees, was one of the most exciting features Google touted at its I/O and Pixel events earlier this year. Although a limited version of Google Lens has already been available in Google Photos for Pixel users, certain users have now begun to find the visual search feature integrated with the Assistant on their Pixel and Pixel 2 phones.
Google Lens in Assistant was first spotted by users on Friday evening, running on their Pixel and Pixel 2 phones. As the rollout reaches everyone, Pixel and Pixel 2 owners will soon be able to use Google Lens with Assistant as well as with Photos.
In Photos, Lens can be activated while viewing any image or screenshot, and it can recognize and capture information such as phone numbers, addresses, URLs, etc. In Assistant, by contrast, it is built into the new Assistant interface, which is triggered by holding down the home button.
In Photos, Google Lens will recognize objects and provide you with information. / © ANDROIDPIT
The new button in the bottom right corner opens a camera viewfinder. Tapping anywhere on the image freezes the view, outlines the item in question, and starts a Google Lens search that offers a range of possible results to identify the object, along with suggested actions such as searching the web, opening other apps, and more. Naturally, you can also share the result along with a feedback rating (thumbs up or thumbs down), which should improve the effectiveness of Google Lens over time.
The new interface also allows users to quickly start a voice search with the microphone, and you can start a new visual query with Google Lens by re-tapping the Lens icon in the bottom right.
Although the update has yet to roll out to our own Pixel phones, it will certainly be welcomed by Google Pixel users of both generations, who will benefit from a machine-learning-assisted user experience unmatched by other devices. In the long term, this feature should reach all Android phones, but it will likely remain a Pixel exclusive for a while.
Are you excited to use Google Lens in Assistant? Have you already received the update?