At this year’s Google Search On event, the company announced several major improvements to Google Lens that apply to iOS, Android, and even the desktop. Google Lens, an image recognition service powered by Google’s AI, was introduced on Android smartphones in 2017 as the spiritual successor to Google Goggles. Until this year, however, it was available exclusively on mobile devices. Google Lens has also recently received design updates. All three of these improvements are significant changes to the platform, especially as Google increasingly introduces AI-based features across its products.
Google Lens improvements
Multitask Unified Model (MUM) enhancements
This year, Google announced the Multitask Unified Model (MUM) at I/O. According to Google, MUM can understand information in multiple formats at the same time, such as text, images, and videos. It can also draw insights from concepts, topics, and ideas about the world around us and identify the connections between them. The company shared an example: spotting a shirt pattern in Google Search and then asking Lens to find socks with the same pattern. This capability will come to Google Lens in the coming months.
Google Lens on desktop
A few months ago, Lens began appearing on desktop when right-clicking an image. It was gated behind a server-side switch, so not everyone had access to it. Google is now rolling out this feature to everyone over the next few months. With Lens, you can search the images, videos, and text content on a website and quickly see the search results in the same tab.
Lens mode on iOS
All images in the Google app on iOS can now be searched with Lens as well. Google calls this “Lens mode”: while browsing a website in the Google app on iOS, you can search any image on the page to find shoppable matches. This feature is currently limited to the United States, and no global release has been announced yet.