Google Lens Multisearch – This is what the update brings!
11.04.2022
6 min read
Last year MUM, this year Multisearch – Google keeps expanding its search functions and refining its algorithm. Alongside multimodal searches powered by the MUM algorithm, i.e. searches that draw information from more than one source or language, Google’s Multisearch now combines image and text search. For search engine users, this means that a web search can start from an image and then be narrowed down or refined with a text entry. Here we tell you in detail what to expect!
What is Google Lens anyway?
Google Lens has been around since 2017. It is accessible via the Google app on your smartphone and lets you visually analyze whatever your camera or photos capture. Based on these analyses, Google suggests suitable search results. You have two options for using Google Lens:
Using a photo, screenshot, or image from your gallery
Using your camera
What features does Google Lens offer you?
Google Lens gives you the following functions:
Translate: Scanning text and translating it
Search or Shopping: Finding matches for the content of your photos
Text: Copying text (passages) and pasting it on your computer
Identification: Recognizing plants and animals
Homework: Scanning text and mathematical formulas to find explanations for them
Places: Finding places that match visually
Food: Getting more detailed information about dishes, ingredients, menus, etc.
Until now, Google Lens has operated independently of web search – Multisearch is set to change that. Because image and text searches are combined, the matching results are combined as well. Unlike before, you can now use more than just text for your search queries, which is especially helpful when searching for products or items.
A special feature: Multisearch also allows you to search for specific patterns. For example, if you see a wallpaper somewhere whose pattern would look good on your sofa, you can take a picture of it. You can then refine your search using the keyword “sofa”, which you can add by swiping up and clicking the “+ Add to your search” button. Google’s Multisearch then applies the pattern it recognizes in the photo and combines it with your text search query to give you the best possible result. Similarly, Multisearch can be used, for example, if you already like a product very much but would like it in a different color. In this case, you specify which color you are looking for via the text search.
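How exactly Google fuses the recognized pattern with the text query is not public. For the technically curious, here is a minimal, purely illustrative sketch of the underlying idea – combining an image and a text refinement in a shared embedding space – using the open-source CLIP model via the sentence-transformers library. The file names, the toy catalog, and the averaging fusion are our own assumptions, not Google’s method.

```python
# Conceptual sketch only: Google has not published how Multisearch combines
# image and text signals. This toy example uses the open-source CLIP model
# (via the sentence-transformers library) to show the general idea of
# merging a photo with a text refinement into one search query.
# File names and the catalog below are hypothetical placeholders.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# CLIP maps images and text into the same vector space.
model = SentenceTransformer("clip-ViT-B-32")

# Embed a hypothetical product catalog (image files of sofas, wallpapers, ...).
catalog_files = ["sofa_floral.jpg", "sofa_plain.jpg", "wallpaper_floral.jpg"]
catalog_embs = model.encode([Image.open(f) for f in catalog_files])

# The Multisearch-style query: a photo of the wallpaper pattern plus the word "sofa".
img_emb = model.encode([Image.open("wallpaper_photo.jpg")])[0]
txt_emb = model.encode("sofa")

# Naive fusion: average the two embeddings so the query carries both signals.
query_emb = (img_emb + txt_emb) / 2

# Rank catalog items by cosine similarity to the fused query.
scores = util.cos_sim(query_emb, catalog_embs)[0]
for name, score in sorted(zip(catalog_files, scores.tolist()),
                          key=lambda x: x[1], reverse=True):
    print(f"{name}: {score:.3f}")
```

In this sketch, the floral sofa should rank highest, because its embedding is close to both the photographed pattern and the word “sofa” – the same effect Multisearch aims for, whatever technique Google actually uses under the hood.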
Conclusion: What the new Multisearch brings us
Even though online shopping clearly benefits the most, Multisearch has many other fields of application. These include, for example, care instructions for plants or background details for city tours and travel. The resulting potential is enormous and extends far beyond e-commerce.
However, the whole thing has one drawback, at least for now: Multisearch is currently only available in English and only in the US market. It is not yet possible to say exactly when it will arrive in Germany. Presumably, the beta test will have to be completed before a worldwide rollout.
Lara Meyer completed her bachelor’s degree in business administration, specializing in media business administration, at the FH in Würzburg. As part of the eology marketing team, she helps spread the collected eology knowledge by sharing her know-how in magazines, blogs, and journals.