Google Lens is preparing a new update: search for anything and find out where it's sold, plus My Ad Center lets you control ad content yourself.

A little more news from Google I/O. Google released the Multisearch feature for Google Lens only about five weeks ago, a combined search that uses both images and text. Now further upgrades have been announced: Multisearch near me, plus a new feature called Scene exploration that surfaces insights on multiple objects at once.

Multisearch near me – Search for anything and find out which stores sell it

As previously covered, Multisearch is an image search through Google Lens that lets users add extra text to refine the query. For example, take a photo of a shirt you like but want in another color: just type "green" and the system will search for the same shirt, or similar ones, in green.
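To make the idea concrete, here is a minimal sketch of an image-plus-text query, assuming a toy product catalog and a stand-in image encoder; none of this reflects Lens's actual implementation, it only illustrates "rank by visual similarity, then apply the typed refinement as a filter."

```python
# Illustrative sketch only: combining an image query with a text refinement.
# The catalog, embeddings, and encoder are hypothetical, not Google's system.
from dataclasses import dataclass


@dataclass
class Product:
    name: str
    color: str
    embedding: tuple  # in a real system, a learned image embedding


def embed_image(photo_id: str) -> tuple:
    # Stand-in for an image encoder; returns a fixed toy vector.
    return (1.0, 0.0)


def similarity(a: tuple, b: tuple) -> float:
    # Dot product as a toy similarity measure.
    return sum(x * y for x, y in zip(a, b))


def multisearch(photo_id: str, text_filter: str, catalog: list) -> list:
    """Rank catalog items by visual similarity, then apply the text refinement."""
    query = embed_image(photo_id)
    ranked = sorted(catalog, key=lambda p: similarity(query, p.embedding), reverse=True)
    return [p for p in ranked if text_filter.lower() in p.color.lower()]


catalog = [
    Product("Shirt A", "green", (0.9, 0.1)),
    Product("Shirt B", "blue", (0.95, 0.05)),
    Product("Shirt C", "green", (0.2, 0.8)),
]
print([p.name for p in multisearch("my_shirt_photo", "green", catalog)])
```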

Multisearch near me extends those results with nearby shops or restaurants. Continuing the example above: once you have found the green shirt you want, add "near me" to the query and Google pulls data from Google Maps to show which stores in your neighborhood have it for sale. Shoppers and foodies will definitely appreciate this.
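Continuing the sketch above, a hypothetical "near me" layer could filter those results against local store inventory and distance. The store data, coordinates, and radius below are invented for illustration; a real system would pull listings from something like Google Maps.

```python
# Illustrative sketch only: layering "near me" on top of a multisearch result.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def stores_near_me(product_name, stores, my_lat, my_lon, radius_km=5.0):
    """Return (store, distance) pairs within radius_km that stock the product."""
    nearby = []
    for store in stores:
        if product_name in store["inventory"]:
            d = haversine_km(my_lat, my_lon, store["lat"], store["lon"])
            if d <= radius_km:
                nearby.append((store["name"], round(d, 1)))
    return sorted(nearby, key=lambda s: s[1])


stores = [
    {"name": "Shop One", "lat": 13.746, "lon": 100.534, "inventory": {"Shirt A"}},
    {"name": "Shop Two", "lat": 13.800, "lon": 100.600, "inventory": {"Shirt A", "Shirt C"}},
]
print(stores_near_me("Shirt A", stores, 13.745, 100.535))
```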

Scene exploration – Search through the camera with real-time conditions

Originally, each Google Lens search was limited to objects within a single frame. With the Scene exploration upgrade, we can now use Multisearch to set search conditions while panning the camera, and it works in real time, showing results live as you go (the crowd at the event applauded enthusiastically).

The example Google demonstrated at the event: a friend asks us to buy chocolate at the supermarket, with the conditions that it must be dark chocolate, contain no nuts, and be tasty (based on product review scores in the database), three conditions in total.

Once all the conditions are entered into Google Lens, we just pan across a shelf lined with different brands of chocolate, and anything that meets the conditions gets highlighted on screen. The feature is a showcase of how smart Google's AI has become; it is almost like Chrome's Ctrl + F, but in the real world.
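As a rough sketch of that "Ctrl + F for the real world" idea, the snippet below filters objects detected in a camera frame against the three demo conditions. The detection results, ingredient lists, and review scores are all invented; this is not how Lens actually works internally, only a way to picture the logic.

```python
# Illustrative sketch only: filtering detected products by the demo conditions.
from dataclasses import dataclass, field


@dataclass
class DetectedProduct:
    name: str
    is_dark: bool
    ingredients: set = field(default_factory=set)
    review_score: float = 0.0  # out of 5, from a hypothetical review database


def matches_conditions(p: DetectedProduct, min_score: float = 4.0) -> bool:
    """Apply the three demo conditions: dark chocolate, nut-free, well reviewed."""
    return p.is_dark and "nuts" not in p.ingredients and p.review_score >= min_score


def highlight(frame_detections: list) -> list:
    """Return the products that should be highlighted in the current frame."""
    return [p.name for p in frame_detections if matches_conditions(p)]


shelf = [
    DetectedProduct("Brand A 85%", True, {"cocoa", "sugar"}, 4.6),
    DetectedProduct("Brand B milk", False, {"cocoa", "milk", "sugar"}, 4.8),
    DetectedProduct("Brand C 70% hazelnut", True, {"cocoa", "nuts"}, 4.5),
]
print(highlight(shelf))  # -> ['Brand A 85%']
```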

Another interesting point: Google says Lens search traffic has tripled over the past year, with Lens now used around 8 billion times each month worldwide.

My Ad Center – You can control the ad content yourself.

We all know that the main revenue of Google's parent Alphabet comes from its advertising business, which accounts for more than 80% of total income. That business depends on collecting data about users' behavior, to the point that people often joke that Google knows us better than we know ourselves (ha).

However, Google wants to show good faith: yes, we live with ads, but the company says the system is transparent and verifiable, and that it never sells user data to other companies. The next step is My Ad Center, which lets users control ad content themselves.

My Ad Center organizes our interests into categories. Content we dislike or do not want in our feed, such as gambling or pornographic content, can simply be switched off from here, and the setting applies across YouTube, Search, Discover, and other platforms that use Google's advertising services. Conversely, content we particularly like can be set to appear in our feed more often.
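Purely as an illustration of the idea (not Google's actual data model), per-topic preferences could be represented as block/boost sets that an ad ranker consults. The AdPreferences structure and topic names below are assumptions for this sketch.

```python
# Illustrative sketch only: per-topic ad preferences applied to candidate ads.
from dataclasses import dataclass, field


@dataclass
class AdPreferences:
    blocked_topics: set = field(default_factory=set)   # never show
    boosted_topics: set = field(default_factory=set)   # show more often

    def allows(self, topic: str) -> bool:
        return topic not in self.blocked_topics

    def weight(self, topic: str) -> float:
        # Boosted topics get a higher ranking weight; others stay neutral.
        return 2.0 if topic in self.boosted_topics else 1.0


prefs = AdPreferences(blocked_topics={"gambling"}, boosted_topics={"travel"})
candidate_ads = [
    ("casino bonus", "gambling"),
    ("city break deals", "travel"),
    ("phone sale", "electronics"),
]
ranked = sorted(
    (ad for ad in candidate_ads if prefs.allows(ad[1])),
    key=lambda ad: prefs.weight(ad[1]),
    reverse=True,
)
print(ranked)  # gambling ad filtered out, travel ad ranked first
```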

Multisearch near me will initially support only English, with other languages to follow. As for the launch schedule, Google has only broadly said "later this year," and the same goes for My Ad Center.

Source: Google I/O