Google is the most powerful search engine in the world, but it still has limitations. Sometimes there are things you want to search for but can’t quite describe in a text box. Thankfully, Google has a new multisearch feature for Lens that helps with exactly that.
Say you see something in the world that you can’t quite describe in words, like an abstract painting, and you want to find a shirt with a similar pattern. You can’t exactly type that into the search box, so what do you do? Well, you can harness the power of Google Lens to help you out. Google teased a new and extremely useful feature called Lens multisearch last year, and it’s built for exactly that kind of dilemma.
How will this Google Lens Multisearch feature help?
So, instead of typing “shirts with abstract art on them” into Google, you can simply take a picture of that painting using Google Lens. Then, using the multisearch feature, type in “shirt”. From there, Google will use its AI wizardry to find shirts with similar patterns for sale.
The same goes for finding other items with similar designs, but the feature goes deeper than that. Essentially, it takes the image data and connects it with the word(s) you typed to find the best matches. It’s like putting 2 and 2 together.
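If you’re curious what that “putting 2 and 2 together” could look like under the hood, here’s a toy sketch. It’s purely illustrative and not Google’s actual implementation: one common approach to multimodal search is to blend an image embedding with a text embedding into a single query vector, then rank catalog items by similarity. All of the vectors and item names below are made up.

```python
import math

# Toy illustration only (an assumption, not Google's real pipeline):
# blend an image embedding with a text embedding, then rank catalog
# items by cosine similarity to the blended query vector.

def combine(image_vec, text_vec, alpha=0.5):
    """Blend the two embeddings and normalize the result."""
    blended = [alpha * i + (1 - alpha) * t for i, t in zip(image_vec, text_vec)]
    norm = math.sqrt(sum(x * x for x in blended))
    return [x / norm for x in blended]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def search(query_vec, catalog):
    """Return catalog item names, best match first."""
    return sorted(catalog, key=lambda name: cosine(catalog[name], query_vec), reverse=True)

# Made-up embeddings: the painting's pattern lives in the first
# dimension, "shirt-ness" in the third.
catalog = {
    "abstract-pattern shirt": [0.9, 0.1, 0.4],
    "plain shirt":            [0.1, 0.9, 0.4],
    "abstract poster":        [0.9, 0.1, 0.0],
}
painting_image = [1.0, 0.0, 0.0]   # Lens photo of the painting
shirt_text     = [0.0, 0.0, 1.0]   # the typed word "shirt"

query = combine(painting_image, shirt_text)
print(search(query, catalog)[0])   # the abstract-pattern shirt ranks first
```

The point of the blend: the photo alone would rank the poster highest, and the word “shirt” alone can’t pick out the pattern, but the combined query surfaces the shirt with the matching design.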
Let’s say you broke your clock and want to know how to fix that model. Take a picture of the clock and type “how to fix”. With that, you should get tutorials for that model of clock (if they exist, obviously). Another example, explained by The Verge: if you take a picture of someone’s nails and type in “tutorial”, you should find tutorials on how to recreate similar nail designs.
There will be limits, however
Now, a feature like Lens multisearch can easily be overestimated. There are plenty of scenarios where it can come in handy, but you’re bound to eventually search for something that doesn’t exist. If you take a picture of the Mona Lisa, don’t expect to find too many dresses with that design.
Right now, we don’t know just when this feature is going to roll out, but it shouldn’t be too long. The company teased it about seven months ago and is testing it as we speak. According to the source, the feature is currently in beta. When it launches, it will be available for Android and iOS.