Back at Google I/O 2021, the search giant detailed the ‘Multitask Unified Model,’ or MUM. The company called it a breakthrough in artificial intelligence, one far better at understanding language.
Google primarily spoke of MUM as a way to improve responses to search queries by better understanding difficult questions. Now, Google is bringing those improvements to another search product: Lens.
One of the main benefits of MUM is that it can understand information in a variety of formats, such as text, images, and video. By integrating MUM into Lens, Google says it is opening up a new form of search by allowing users to combine visual and text-based queries for better results.
The company shared some examples of how it could work. One was shopping for clothes: if a customer sees a pattern they like on a skirt and wants the same pattern on their socks, they can use Lens to search for it. Google says the feature will launch in Lens “in the coming months.”
New search experiences focus on broadening themes and visual results
MUM, our advanced AI model, is coming to #GoogleLens early next year. You will be able to take a photo AND ask a question, which can be useful in those moments when you need to fix a broken part and have no idea what it is 🤷🔧 #SearchOn pic.twitter.com/cmedce3dB2

— Google (@Google) September 29, 2021
Next, the company detailed a redesigned search experience coming to Google Search. Three new components arrive as part of this redesign.
First is “Things to know,” which will offer expanded search suggestions based on broad topics. For example, searching for something like “acrylic paint” can surface “deeper insights” on the subject, such as “how to make acrylic paintings out of household items.” Google pitched it as a way for users to dig deeper into search topics.
Next are “Refine this search” and “Broaden this search.” Working as two sides of the same coin, Refine and Broaden are another way to help users explore a topic. If a user searches a very broad topic, ‘Refine’ can suggest narrower searches to help them get more specific. ‘Broaden’ moves in the opposite direction: if someone starts with a narrow query, it can help them zoom out for the bigger picture.
Additionally, Google says it will soon offer a more visual search results page. The new page brings together various types of results in one place, combining text, images, and video. It won’t appear for every query, but users will start to see it when searching for visual topics.
“Things to know” will roll out in the coming months, as will Refine and Broaden, though both will be limited to English at first. The more visual results page will launch first for English users in the US.
Finally, Google plans to bring MUM to video with a new experience that identifies related topics in a video, even when a topic is not explicitly mentioned. The feature could help people dig deeper into the subjects a video covers.
This will begin rolling out on September 29 for English users in the US.