Google Search Is a Mess. Can Mobile AI Make It Better?

In recent years Google has used the word “helpful” to describe new features added to its search product, its voice assistant, its generative AI tool Bard, even its Pixel earbuds. A keyword search for “helpful” on Google’s own corporate news blog brings up more than 1,200 results.

Depending on what you’re searching for, though, Google’s main search service has become less helpful. To hear one columnist describe it, Google search is now a “tragedy” that is “bloated and overmonetized.” The Financial Times notes that it’s “cluttered with adverts”—less encyclopedia, more Yellow Pages. One prominent ex-Googler blames the lowered quality of Google search on the degradation of the web itself—not explicitly Google, which still offers the world’s information for free at our fingertips. And one recent study of product-review results shows that, despite indications of lower-quality results across search, Google actually performs better than some of its competitors.

But it doesn’t take a group of researchers or the credentials of a top technologist to run a quick Google search and notice that the first few results, at least, are ads, with more clutter appearing below the digital fold.

Google, like other tech giants, now sees generative AI as a tool for streamlining and expediting search, and it is straddling the fine line between making search genuinely smarter and further mucking up its already overstuffed user interface. Its latest announcements around generative AI on mobile search are part of that experiment: Is it possible to make Google search more convenient, more accessible, even if the company is still committed to the same ad strategy?

Later this month, high-end Android phones—Google’s own Pixel 8 and Pixel 8 Pro along with Samsung’s brand-new Galaxy S24 phones—will get a few new AI features that integrate search (and Google Lens, the company’s image-recognition app) directly into other apps on the phone. One of those features is called Circle to Search, which lets you use touch to select images, text, or videos within an app and run a quick search in an overlay that appears at the bottom of the screen.

An example Google gave in an early demo was a text-message exchange between friends, where one friend suggested a restaurant and the other was able to Circle to Search it and pull up results for the restaurant without leaving the text app. Another use case would be pausing and Circling a product you spot in an Instagram video and running a search for that product, again all within the same app display.

Both of these use cases are examples of a certain efficiency in search—a kind of helpfulness, if you will—because they allow the user to run searches without switching between apps. But they also present obvious commerce opportunities (which is often what Lens is used for, in addition to nature-spotting), which means they’re good for Google’s ad business. Google confirmed that Search and Shopping ads will continue to appear in dedicated ad slots on the results page. And since the search overlay takes up only a fraction of your mobile display, if the results are ads the feature could quickly end up being more frustrating than efficient.

That’s where generative AI comes in: A summarized response might make more sense on limited screen real estate, rather than a series of links. Google’s new AI-powered multi-search function does something similar to Circle to Search, just with a different input. When you use Google Lens now—the visual search option within the Google mobile app—by pointing your phone at an object, the results will include “AI-powered insights” in addition to the search results you’d already expect.


The example Google used was a board game: Spot a game you don’t know, snap a photo of it, ask “How do you play this?” and Google’s AI will spit out an overview. Another option: Pointing the phone at a broken appliance and asking “How do I fix this?”

“In my mind this is about taking search from multi-modal input to really doing multi-modal output as well,” says Liz Reid, vice president and general manager of search at Google, referring to the various means by which humans can interact with a computer or AI model to produce potentially more relevant results. “It really unlocks a set of questions that previously you couldn’t just ask Google.”

Unlike Circle to Search, AI-powered multi-search results won’t require enrollment in Google’s SGE, or Search Generative Experience, a portal where early testers can get access to new AI tools. AI-powered multi-search will be available on any iOS or Android phone in the US running the Google app. However, people outside the US who are using Google’s SGE can also get a preview of AI-powered multi-search.

These are incremental updates, but that’s characteristic of Google’s approach to SGE, where the company has been dogfooding some of its latest and most advanced AI search features before deploying them more widely. Bringing early users into SGE not only feeds Google more data that can train its AI models, but it also gives Google some wiggle room if the product isn’t perfect just yet. Reid says there likely isn’t going to be a light-switch moment when the SGE experience fully replaces Google search as we know it; rather, it’s about “pushing the boundaries of what’s possible and then thinking about which use cases are helpful and that we have the right balance of latency, quality, and factuality.”

Approaching a whole new era of search this way is certainly helpful to Google. In an ideal AI future, it would be more helpful to those searching, too—on both mobile and the web.
