Added to Google Search, the multisearch option lets you combine images and text to search for things you don’t know exactly how to describe. For example, you can use a picture of food to find the dish on the menu of a nearby restaurant, if you’re lucky.
It often happens that you search for a person or object without knowing its exact name. In that case, starting a search from an image alone will often return only generic matches: pictures that look similar but are irrelevant to what you are actually looking for. Likewise, if you do not know exactly what you are searching for, phrases and keywords alone will also return generic results.
Google addresses this limitation with artificial intelligence models that can combine text-based search cues with more abstract information extracted from user-uploaded photos.
Done well, the new Google Multisearch feature can be far more powerful than a simple keyword search: the AI helps determine the context in which the search was launched, along with harder-to-interpret cues such as the meaning of the phrases used in the query and the images added as hints toward the desired information.
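Google has not published how multisearch combines the two signals, but one common technique for this kind of image-plus-text query is to blend an image embedding with a text embedding and rank candidates by similarity. The sketch below is purely illustrative: the toy three-dimensional vectors, the catalog entries, and the blending weight `alpha` are all invented for the example, not anything Google has disclosed.

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def combined_query(image_vec, text_vec, alpha=0.5):
    # blend the image and text embeddings into a single query vector;
    # alpha weights the image signal against the text signal
    return [alpha * i + (1 - alpha) * t for i, t in zip(image_vec, text_vec)]

# toy embeddings -- in practice these would come from large vision/language models
catalog = {
    "margherita pizza": [0.9, 0.1, 0.0],
    "green dress":      [0.0, 0.8, 0.2],
    "sneakers":         [0.1, 0.1, 0.9],
}
image = [0.8, 0.2, 0.0]   # hypothetical embedding of a pizza photo
text  = [1.0, 0.0, 0.0]   # hypothetical embedding of a text refinement

query = combined_query(image, text)
best = max(catalog, key=lambda name: cosine(query, catalog[name]))
print(best)  # -> margherita pizza
```

The point of the blend is that neither signal alone has to be precise: a vague photo plus a vague phrase can still produce a sharp combined query.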
Starting from the apparel-shopping scenario, Google is expanding its multisearch feature to cover any product offered by nearby stores or businesses, as long as their activity is visible online. For example, you will be able to start from a simple street photo of a location, or a picture of a delicious pizza ordered by an acquaintance, and use Google Search to find exactly the pizzeria you would like to visit.
Finally, the search can be narrowed by simply ticking a “near me” option, helping users find information about nearby restaurants and shops from generic cues such as a picture of an item of clothing, with or without an exact description.
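Conceptually, a “near me” toggle just adds a geographic filter on top of the relevance ranking. A minimal sketch of that idea, with made-up place names, coordinates, and relevance scores (none of this reflects Google’s actual implementation):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance between two points on Earth, in kilometres
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# hypothetical candidates already scored for relevance to the image+text query
places = [
    {"name": "Pizzeria Roma",  "lat": 44.44, "lon": 26.10, "score": 0.92},
    {"name": "Trattoria Nord", "lat": 44.80, "lon": 26.50, "score": 0.97},
]
user_lat, user_lon = 44.43, 26.09

# "near me": keep only candidates within a radius, then rank by relevance
near_me = [p for p in places
           if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= 5.0]
near_me.sort(key=lambda p: p["score"], reverse=True)
print([p["name"] for p in near_me])  # -> ['Pizzeria Roma']
```

Note that the distance filter runs before the relevance sort, so a highly relevant but distant result is simply excluded rather than outranked.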
Google says the new multisearch option draws on a vast database of millions of pictures and user-submitted reviews of visited locations. Any photo of a dish ordered at a restaurant can thus become a reference point for potential customers who happen to be looking for something similar.
In a first stage, local multisearch will be available in the English version of Google Search in the second half of this year. For the best experience, Google recommends using the Google app for Android and iOS on recent smartphones, tapping the “Lens” icon to capture and upload images directly from your phone. Judging by the strict hardware requirements, we can assume that part of the query processing is handled by the AI co-processor on users’ own devices.