Background

Google Expands 'AI Mode': Another Era Of Search By Asking It Anything

Ask Anything

For decades, Google reigned supreme over the internet, confidently dominating the digital landscape.

Since its founding in 1998, the company revolutionized how people accessed information—replacing outdated web directories with cutting-edge algorithms that delivered fast, accurate, and indispensable search results. This breakthrough propelled Google to become one of the most powerful and wealthiest tech titans in history.

But that dominance was shaken with the arrival of OpenAI’s ChatGPT—a conversational AI that redefined how users interact with information.

What began as a novel experiment quickly revealed its immense potential. Google reportedly declared a "code red," seeing ChatGPT not merely as a rival, but as a looming threat to its core search business.

The innovation that sparked curiosity soon ignited a full-scale technological rivalry—a race between Google, the long-established digital giant, and OpenAI, a bold challenger intent on reshaping how people search, learn, and engage with the web.

At the center of Google’s counteroffensive is AI Mode—a clear declaration of its ambition to remain the de facto leader in the search engine realm.

After embedding AI Mode in its search engine, Google is now expanding the tool to more users.

In a blog post, Google said:

"As we’ve rolled out AI Overviews, we’ve heard from power users who want an end-to-end AI Search experience. So earlier this year we began testing AI Mode in Search in Labs, and starting today we’re rolling out AI Mode in the U.S. — no Labs sign-up required. AI Mode is our most powerful AI search, with more advanced reasoning and multimodality, and the ability to go deeper through follow-up questions and helpful links to the web. Over the coming weeks, you’ll see a new tab for AI Mode appear in Search and in the search bar in the Google app."

At its I/O developer conference, where the expansion was announced, the search giant laid out its vision for the future of searching the web.

The aim is to shift Google’s ubiquitous search engine from being a box for processing keywords to a system of "digital agents" that can crawl the web and answer questions based on a person’s real-world surroundings, tastes and preferences.

Unlike AI Overviews, which simply display AI-generated summaries at the top of search results, AI Mode goes a step further.

What sets AI Mode apart from traditional Google Search is the way it handles user queries. Instead of processing a query as a single block, it dissects it into smaller, relevant subtopics and initiates additional searches based on those to craft a more tailored response.

Google has also revealed that AI Mode will soon incorporate a user’s search history to personalize results further, with integration into other Google services like Gmail on the horizon.

AI Mode, previously limited to users enrolled in its Labs program for early feature testing, is now available to all users in the U.S. via the Google app.

In addition to its refined query handling, AI Mode introduces two new interaction methods.

The first one enables Google to carry out tasks on behalf of the user, while the second leverages the phone’s camera to allow users to visually show Google their surroundings.

One of the driving technologies behind this capability is Project Mariner—a research prototype announced last year. Google says it enables the assistant to complete multi-step tasks and deliver results autonomously. For instance, a user could request, “Find two reasonably priced tickets for this Sunday’s Reds game in the lower section,” and the AI will scan ticket platforms, compare prices, fill out forms, and display the best options.

Though AI Mode is now generally accessible in the U.S., these two advanced features still require opting into Labs.

Initially, this will work for purchasing tickets, making dining reservations, and scheduling local services via partners like Ticketmaster, StubHub, Resy, and Vagaro. These features are set to roll out to the Labs section of the Google app in the coming months.

It's worth noting that while AI Mode in the Google app can already use AI to process information it gathers from the camera, thanks to the company's Lens technology, Google is stepping further by making the AI capable of processing that information in real time.

The idea is to make it easier for Google to answer questions about complex tasks that are difficult to describe – such as whether the specific bolt in the toolbox is the right size for the bike frame being fixed – just by pointing a phone at it and asking.

The moves underscore that Google’s most important business is facing more competition than ever.

Chatbots like ChatGPT and AI-fueled search engines such as Perplexity present an alternative way to find information and get things done – two tasks firmly at the center of Google’s core business. The newly announced tools can be seen as an effort to prove its nearly 30-year-old search engine isn’t losing relevance in the AI era.

“What all this progress tells me is that we are now entering a new phase of the AI platform shift, where decades of research are now becoming reality for people, businesses and communities all over the world,” Sundar Pichai, CEO of Google and its parent company Alphabet, said in a press briefing ahead of the conference.

It's worth noting that the features of AI Mode, and everything that comes with it, may overlap with those available in Google's Gemini assistant, potentially causing redundancy and confusion among consumers.

Robby Stein, vice president of product for Google search, addressed this by saying that Google Search's AI Mode is more about learning and finding information from the web, whereas Gemini is meant to be a helper for tasks like generating code and writing business plans in addition to answering questions.

Published: 
22/05/2025