Google is rolling out AI Mode to everyone in the U.S., with support for deeper research, comparison shopping, and more on the way.

The company is launching Google AI Mode to everyone in the United States starting this week, a feature that lets users ask complex, multi-part questions through an AI interface.

The feature builds on AI Overviews, Google's existing AI-powered search experience, which displays an AI-generated summary at the top of the search results page. AI Overviews launched last year to a rocky start, with Google's AI offering dubious answers and suggestions, such as recommending glue as a pizza topping.

Image source: Google

Still, Google claims AI Overviews has been a success in terms of adoption, if not accuracy, with more than 1.5 billion users encountering the feature every month. The company says it has expanded to more than 200 countries and territories and is available in more than 40 languages.

AI Mode, meanwhile, lets users ask complex questions and follow-ups, and will now exit Google's Search Labs, where it was initially tested. With other AI companies such as Perplexity and OpenAI offering their own web search capabilities, AI Mode represents Google's vision for the future of search at a time when the company fears losing search market share to rivals.

Image source: Google

As AI Mode rolls out more widely, Google is touting some of its new features, including Deep Search. While AI Mode answers your queries by breaking questions down into different subtopics, Deep Search does this at a much larger scale: it can issue dozens or even hundreds of queries to produce an answer, which also includes links so you can dig into the sources yourself.

Image source: Google

The result is a fully cited report generated in minutes, Google says, potentially saving you hours of research.

The company suggests the Deep Search feature could be useful for comparison shopping and similar tasks, whether for big household appliances or summer camps for the kids.

Image source: Google

Another AI-powered shopping feature coming to AI Mode is a virtual "try on" option for apparel, which uses an image you upload of yourself to generate an image of you wearing the item. The feature understands 3D shapes as well as fabric types and how they stretch, Google notes, and is rolling out in Search Labs starting today.

In the months ahead, Google says it will offer U.S. users a shopping tool that can buy an item on your behalf once it reaches a specific price. (You'll still have to tap "buy for me" to kick off this agentic process, though.)

Both AI Overviews and AI Mode will now use a custom version of Gemini 2.5, and Google says AI Mode's capabilities will gradually roll over to AI Overviews over time.

AI Mode will also support the use of complex data in sports and financial queries, arriving in Labs sometime "soon." This lets users ask complex questions, such as "compare the Phillies' and the White Sox's home-game win percentages over the past five seasons." The AI will search across multiple sources, combine the data into a single answer, and can even create visualizations on the fly to help you better understand the data.

Image source: Google

Another feature takes advantage of Project Mariner, Google's agent that can interact with the web to take actions on your behalf. Initially available for queries involving restaurants, events, and other local services, AI Mode will save you the time of researching prices and availability across multiple websites to find the best options, such as affordable concert tickets.

Search Live, launching later this summer, will let you ask questions based on what your phone's camera sees in real time. This goes beyond the visual search capabilities of Google Lens, as you can have an interactive back-and-forth conversation with the AI using video and audio, similar to Google's multimodal AI system, Project Astra.

Image source: Google

Search results will also become personalized based on your past searches if you opt in to connect your Google apps, with the feature launching this summer. For example, if you connect Gmail, Google can learn your travel dates from a booking confirmation email and use them to recommend events in the city you're visiting. (Google notes that you can connect or disconnect your apps at any time.)

The company notes that Gmail is the first app to be supported with personalized context.