Google has started rolling out its visual search feature, Google Lens, in Assistant for the first batch of Pixel and Pixel 2 smartphones. Google introduced Lens as a somewhat exclusive preview for the Pixel 2 lineup on October 4, but it also said the feature would soon be pushed to earlier Pixel devices and promised that Lens would come to Assistant in the near future.
One user running the 8.1 beta on the original 5-inch Pixel reported that Google Lens showed up inside Google Assistant: "No need to take a picture to use it, it popped up with its own viewfinder." https://i.imgur.com/rPakQgL.jpg https://i.imgur.com/mDbdjmD.png
Built into the Photos app, Google Lens can recognize things such as addresses and books, among other items. In Photos, the feature can be activated when viewing any image or screenshot, while in Google Assistant it is integrated right into the sheet that pops up after holding down on the home button.
“Lens was always intended for both Pixel 1 and 2 phones,” Google had earlier said in a statement. The app was announced by the tech giant during Google I/O 2017. It has been designed to bring up relevant information using visual analysis.
Pressing this new button in the bottom right corner opens up a camera viewfinder. Tapping anywhere freezes the view and begins a search, with possible results and actions appearing as a carousel of suggestion chips at the bottom of the screen. There is also a rectangular overlay on the item in question, with a close button in the top right corner.
Meanwhile, users can quickly start a voice search with the microphone while a new visual lookup is triggered by re-tapping the Lens icon in the bottom right.
This is not yet a global rollout, but some users are seeing the feature pop up on their Pixel phones. We presume it will reach all users within a few weeks.