Have you ever wanted a question answered without knowing the exact words to ask? Google now has a solution: at the beginning of April, the company announced Google Multisearch, a new feature that lets users search with text and images at the same time via Google Lens, the company’s image recognition technology. Multisearch is currently available in the United States as a beta feature in English.
What Is Google Multisearch?
Google Multisearch is the simultaneous use of images and text within a single search query via Google Lens. Users can go beyond the search box and ask questions about what they see. Google views the new feature as an opportunity to deliver more relevant search results and to address a problem that occurs fairly often:
There is a visual component to what you are looking for, but it is difficult to describe in words alone.
The idea behind Google Multisearch is therefore to help users search more precisely for things that are hard to describe. This could be a specific shoe model, or an unidentified plant about which you want more information or for which you want to find similar items.
Beta users can access the new feature through Google Lens, located at the far right of the Google search bar in the app on both iPhones and Android devices. Google Lens itself was presented at last year’s Google developer conference and explained in more detail in our article on the Google I/O Keynote 2021. The Lens icon lets searchers take a photo and then view similar products that Google finds on third-party sites.
So much for the short version of how Google Lens works. This is where Google Multisearch comes in, allowing users to refine their search queries even further. They can swipe up and use the “Add to your search” field, where any word can be added to be considered in combination with the uploaded photo in the search query; more on that below.
Google Multisearch In Practice
In practice, the new Google Multisearch could work like this:
Google Multisearch When Shopping:
Let’s say you find a pair of shoes you like, but you’re not a fan of the color they come in. You can now call up a photo of the shoes and add the word “white” to your search query to find the item in the color you want.
Multi-Search For Setup Questions:
Google even goes one step further: Multisearch can also serve as a kind of furnishing advisor. If you’re looking for new furniture and want to make sure it matches your current decor, you can take a picture of your living room and add the text “coffee table.” Google will then suggest suitable models for you.
Multi-Search Function For Plant Questions:
For all hobby botanists, and especially those who want to become one and therefore don’t yet have a green thumb, Google Multisearch will offer quick help with all plant matters:
Do you have a plant and don’t know what care it needs? Beyond using Google Lens to identify the species from an image, you can now add a photo of your plant plus the word “care” to your search to learn more about it.
We are curious to see how the new feature performs in the beta version, and if and when Multisearch will be rolled out more widely.