There used to be a cool feature on our phones where you could bring up the Assistant (whether with "Google Now" back in the day or by just asking, "Hey Google, what's on my screen?"). The Assistant could look through the contents of your display and give you ideas: if it was a restaurant, it could find you the address, let you select text, and so on.
Well, now when you pull up the Assistant, there's no "What's on my Screen" button, and you can't ask it that anymore. So how do you do it?
Google Lens to the rescue
It took me a while to figure out... and it was way simpler than I thought.
It's not always perfect, I'll admit. But it works more often than you'd think. It's sad that it doesn't always work on different QR codes, though.
That was my real hope. Sometimes someone will share a QR code on Imgur or Facebook, but how do I scan it from my phone? I was hoping I could just pull up the Assistant and have it use Lens for me.
Sometimes that worked, sometimes not. Ah well. Better than nothing, or than having to go pull up your computer and then use your phone to scan it.

Steps to do it:
Now, your mileage may vary compared to my Pixel, and the steps may be slightly different:
- Find the image you want to use to Google search with
- Swipe up and hold that 'up' position for a second to bring up all the open tasks/cards
- Find the image you want to search with and long press on it
- Sometimes it may not be recognized as an actual image... I've had this issue; I guess it depends on the app. Sometimes it only let me select text within that picture, sometimes just the image
- A little pop-up box of options should appear, and one of them should be 'Lens'
- Lens will do its magic to try to find out what it can.
- A list of its suggested findings will appear in the bottom half of the screen, which you can then scroll through
- There will be a button at the bottom left with specific features if you want to filter those results based on text
- There's also a button at the bottom right that will let you limit the search to a specific part of the image