Google Search Is Growing Up

Google held its annual I/O developer event this week. The company gathered software developers, business partners, and folks from the technology press at Shoreline Amphitheater in Mountain View, California, just down the road from Google corporate headquarters, for a two-hour presentation. There were Android announcements. There were chatbot announcements. Somebody even blasted rainbow-colored robes into the crowd using a T-shirt cannon. But most of the talk at I/O centered around artificial intelligence. Nearly everything Google showed off at the event was enhanced in some way by the company’s Gemini AI model. And some of the most shocking announcements came in the realm of AI-powered search, an area where Google is poised to upend everyone’s expectations about how to find things on the internet—for better or for worse.

This week, WIRED senior writer Paresh Dave joins us to unpack everything Google announced at I/O and to help us understand how search engines will evolve for the AI era.

Show Notes

Read our roundup of everything Google announced at I/O 2024. Lauren wrote about the end of search as we know it. Will Knight got a demo of Project Astra, Google’s visual chatbot. Julian Chokkattu tells us about all the new features coming to Android phones, Wear OS watches, and Google TVs.

Recommendations

Michael Calore is @snackfight. Lauren is @LaurenGoode. Ring the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

How to Listen

You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:

If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for Gadget Lab. If you use Android, you can find us in the Google Podcasts app just by tapping here. We’re on Spotify too. And in case you really need it, here's the RSS feed.

Transcript

Note: This is an automated transcript, which may contain errors.

Michael Calore: Lauren.

Lauren Goode: Mike.

Michael Calore: How often do you let the bots do stuff for you?

Lauren Goode: Initially I would say that I don't really, but then when I think about it, I use them a lot. I use Slackbots. I don't know if this is a bot, but I love the Gmail Smart Replies.

Michael Calore: That counts. Yep.

Lauren Goode: Sounds good. Thanks. I'm positive that I've talked to customer service bots, as we all have. What about you?

Michael Calore: I do it all myself. I can't let go. I have some calendar automations, but mostly I just don't trust the bots. Also, I really enjoy being in control.

Lauren Goode: This doesn't surprise me about you. I have bad news for you.

Michael Calore: Uh-oh.

Lauren Goode: Have you heard the word agentic?

Michael Calore: Like relating to an agent?

Lauren Goode: Yes.

Michael Calore: I think only important people like you, Lauren, have an agent.

Lauren Goode: Yes, but now everyone is going to be using agents. It's like the next level of bots. They're going to be running everything.

Michael Calore: OK. So big tech companies won't let us put in the work anymore to actually read or find or watch anything. They want to automate everything and speed up our lives whether we're ready for it or not.

Lauren Goode: Yeah, you definitely made the logical leap there. We should talk about it.

Michael Calore: Let's do it.

[Gadget Lab intro theme music plays]

Michael Calore: Hi, everyone. Welcome to Gadget Lab. I am Michael Calore, WIRED's director of consumer tech and culture.

Lauren Goode: And I'm Lauren Goode. I'm a senior writer at WIRED.

Michael Calore: We are also joined this week once again by WIRED senior writer Paresh Dave. Welcome back to the show, Paresh.

Paresh Dave: Hey, Mike. Excited to be in this new cozy space with you all.

Michael Calore: You like it? Do you like our magazines?

Paresh Dave: Very close to the magazines. Very, very close to the magazines.

Lauren Goode: In fact, there's one in there that has Paresh’s recent story on Reddit, right?

Michael Calore: Oh, yes.

Lauren Goode: Shameless plug.

Michael Calore: Somewhere. Our studio is in the library. Just in case those of you listening on the audio medium did not pick up on the fact that we're surrounded by WIRED magazines. Since Paresh is here, you know what that means. Today we are talking about Google. The company held its I/O developer event this week. There was a big in-person gathering near its headquarters in Mountain View, California. There were Search announcements. There were Android announcements. Somebody got on stage and blasted rainbow-colored robes into the crowd using a T-shirt cannon. But the talk of I/O was artificial intelligence. Nearly everything Google showed off at the show was AI-enhanced in some way. There was AI search, AI-powered chatbots, AI-generated videos, AI-powered restaurant recommendations. Most of the stuff is reliant on Gemini, Google's large language model, which also got some upgrades this week. We're going to talk about that. We're going to talk about Search. But first, Lauren and Paresh, you were both there at Google's two-hour-long, AI-packed keynote. Set the scene for us a little bit. Did this year's I/O feel different than last year or the year before?

Paresh Dave: AI-packed, yeah. There were 120 or 121 mentions of AI, according to Sundar Pichai, Google's CEO, who used AI to analyze the script for the show. But I thought it was a little more crowded this year, as Lauren noticed as well, a few more people, but also a little more underwhelming and confusing, I would say, than past years. And there was less talk of some of the other Google products, like no mention of Maps, for example, this year.

Michael Calore: Wow.

Lauren Goode: Yeah, that part was interesting. Google I/O has traditionally been fairly verticalized, right? There would be announcements on Maps, a lot of Android announcements, Search, Google Cloud and Workspace and things like that. Now it was structured around this horizontal theme of AI, AI, AI (we just made it 124 mentions), and then from that it was like, “Let’s examine every product line at Google, or nearly every product line, and how it's being infused with AI.”

Michael Calore: Right. So the big thing that is going into all of these tools is Gemini, right?

Lauren Goode: That's right.

Michael Calore: Can we talk about what Gemini is and how it fits in?

Lauren Goode: Sure. Gemini is Google's frontier model, as the company calls it. It's the foundational model for this era of generative AI. Google will always make the point that it's been working on AI for over 10 years now, and if you go back to 2017, I'm going to give another shameless plug for a WIRED story: Steven Levy recently wrote a great story in the magazine about the eight AI researchers who were behind the 2017 paper on transformers, which is the T in ChatGPT. That was some of the foundational technology powering this new generative AI era, where AI isn't just machine learning but is sort of taken to the next level because it relies on large language models. Google's version of that is Gemini. It's the name of the model, but it's become kind of a marketing phrase too. One of the things we saw yesterday were these little actions within apps called Gems. So Google is taking that branding and extending it throughout its product universe.

Paresh Dave: I feel like I got to roll up my sleeves here because this is one of my annoyances. There's multiple versions of the Gemini model, so depending on which product you're using and how much you are paying for it, including free, potentially, you could be getting a very different Gemini experience. If you're using Gemini on your phone, that's maybe a different experience and things are rolling out to different Google products at different paces. So you may be interacting with one version of Gemini in the Gmail that you use for your work if your company uses Gmail for its email, but then when you're using Gmail for your personal use, you might be interacting with another version of Gemini and the capabilities are a little bit different in all of those.

Lauren Goode: Yeah. Last year, or over the past year I should say, you could pay $20 a month for Gemini Advanced, which is kind of like ChatGPT Plus, or Perplexity Pro, or whatever else people also pay $20 a month for. It's become this kind of standard pricing for, "You're going to have a mostly consumer-facing experience with this AI, but we're going to boost it a little bit." That was called Gemini Pro Advanced, I think, or Gemini Advanced.

Michael Calore: Just Gemini Advanced, I think.

Lauren Goode: And now that version is Gemini 1.5 Pro. But then there might be a different version for developers that's a little bit faster, or that can handle more tokens, which expands the context window of the model. So this is something that I think is confusing generally throughout the AI industry right now, but Google also has kind of a fun history of not doing a great job with naming its products, and we're seeing that extend into this too.

Paresh Dave: But they also have fun ways of talking about these tokens. The number of tokens basically means how much content you can upload and have the model analyze. I think it was something like 160 Cheesecake Factory menus that it will be able to handle in the future.

Lauren Goode: I saw a demo of Gemini 1.5 Pro where they were uploading a 1,500-page stack of documents. One was a climate report from 2005 and another was a climate report from 2023, and they uploaded all the pages and then said, "Tell us the key differences between the 2005 report and the 2023 report." And then it only took, I don't know, I actually looked away and was talking to someone, so maybe a minute, maybe a couple of minutes, and then all of a sudden it just spit out all the bullet points. Whether or not that is entirely accurate is the big question around using these tools, but it was an impressive demo.
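[For a rough idea of what that kind of long-context request looks like in code, here is a minimal sketch using Google's google-generativeai Python SDK. The API key placeholder, file paths, and exact model name are illustrative assumptions, not the setup Google used in its demo; the point is simply that both documents get packed into a single, very large prompt.]

```python
# Minimal sketch: compare two long documents with a long-context model.
# Assumes the google-generativeai SDK is installed and you have an API key;
# the file paths and model name below are hypothetical placeholders.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-pro")  # long-context model; exact name may vary

# Read both reports as plain text.
with open("climate_report_2005.txt", encoding="utf-8") as f:
    report_2005 = f.read()
with open("climate_report_2023.txt", encoding="utf-8") as f:
    report_2023 = f.read()

prompt = (
    "Here are two climate reports. List the key differences between the "
    "2005 report and the 2023 report as bullet points.\n\n"
    "--- 2005 REPORT ---\n" + report_2005 + "\n\n"
    "--- 2023 REPORT ---\n" + report_2023
)

# One call; the large context window is what lets the whole stack of pages
# fit into a single request.
response = model.generate_content(prompt)
print(response.text)
```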

Michael Calore: Right.

Lauren Goode: That's like a thing that's happening right now too in the industry. Everyone says you can put together a pretty impressive AI demo. It's great for demos. Beyond that, there are a lot of questions.

Michael Calore: And there was a really striking demo that we saw at I/O this week, which was the Project Astra demo. Our fellow reporter, Will Knight, got the full breakdown of Astra and he wrote a story about it and he saw the same demo that we all saw on stage, which is through the camera. So you open up a smartphone and you point a camera at things and you can interact with the AI voice assistant and ask it to tell you what it sees. So you can point it at something and say, "Can you translate this for me?" You can point it at a piece of code and say, "What does this piece of code do? Can you walk me through it?" You can ask it to give you creative answers, like the person in the demo pointed it at their dog and a stuffed toy and said, "Can you give me a good band name for these two?" Pointed out the window, "Tell me what I'm looking at."

And it identified the neighborhood that they were in. And all of that seems really cool, but that's sort of what we would expect from these tools like, "Here's a piece of data, analyze it. Tell me what it means." The thing that I thought was the most interesting was that at the end of the demo, the person says, "Can you tell me where I left my glasses?" And it said, "Yes, they're behind you on the table next to the red apple." And the person was able to walk over and pick up their glasses, and then of course they put the glasses on and they're smart glasses and then the demo continued through the smart glasses. But that moment of the person in the demo saying, "Can you tell me where I put my glasses?" is really interesting because it introduces the concept of context, like it understands.

Lauren Goode: And memory.

Michael Calore: Yeah. And memory. It understands where you are in the world, and it's observing everything, not just the things that you're asking it about.

Paresh Dave: So there's a few things I can add to that. One is the memory. Google right now is being a little bit fuzzy on how much context it can keep. A minute, five minutes, an hour, that's a big difference between all of those, and they're not quite saying much on that. And then another thing to note is that a lot of those demos right now are on sort of a plain white background. We got a chance to try some demos of this Project Astra technology yesterday. Our colleague Reece Rogers put a T-rex toy on a bare white table and asked a Project Astra version of Gemini to make up a story about this T-rex toy. And it talked about the T-rex exploring the great white expanse, because that's what it viewed the table as. So I think that's something to keep in mind as well. And it's really, really early. We were able to see some prototypes on a phone, and let's just say it's a very rudimentary system right now that Google has, so definitely very early prototype stage.

Michael Calore: Right. And that demo—to Lauren's point, it's easy to make a good demo—maybe demonstrates a piece of it that is not fully functional yet, but it shows you that Google is thinking about that, and that's what it eventually wants it to be, right?

Paresh Dave: Yeah. The demos that we saw used an overhead camera, so it's looking down on this white table. While we were doing this demo with my colleague Reece, there was a Diet Coke can that happened to be sitting there, and Reece was like, "Oh, can I put this here?" And the Google employee who was running the demo was like, "Yeah, that camera won't be able to see that it's a Coke can because the label is on this side of it." There was another thing where there was a gemstone with a label below it, and it was able to recognize the gemstone as an amethyst, but it couldn't read the label because the camera doesn't have enough resolution or whatever to read all of that. So I think there are a lot of limitations to how these demos are set up.

Lauren Goode: One thing I noticed when we were sitting there during the keynote is that Paresh, as soon as each product was announced, would basically go to the Google documentation for it and try to look up how it was being trained and what data was being stored. Because there is a big question around what is feeding these AI tools to make them so smart. And then also, if we as consumers, as employees, as customers of these products start using them more, we're making them better. Paresh, what have you learned so far about what kind of training data is going into some of these new AI products?

Paresh Dave: It varies by product, again, since you have to be very clear about the policies for every single one. But it seems like right now, in part because of European Union regulations that limit how data can be shared between products, Google is trying to use data that you contribute to a specific service like Gmail only to improve Gmail, as much as possible. So one of the things that they mentioned is that for Google Photos, there's this new feature where you can ask for a specific type of thing from your library, because we all have so many photos now that it's hard to sift through them. And so you can talk to it, using Gemini basically, to find the photos that you're looking for.

And those conversations that you have with Gemini in Photos will only be used to improve Photos and won't affect other services. Google is saying your Google Photos will only be used to train Google Photos-related algorithms. So I think we'll see more of that, in part because of these European regulations that prevent sharing of data between products without proper consent. And in many cases, Google doesn't like to get that consent; it just likes to do its thing.

Michael Calore: The world is changing for the better, we hope. All right, we do need to take a break and when we come back we'll talk about Search.

[Break]

Michael Calore: OK, Lauren, I want to switch it up now and talk about Search. You wrote the story for WIRED about how nearly everything we've gotten used to about Google Search is now going to change. Your story has an awesome headline, "It's the End of Google Search as We Know It."

Lauren Goode: Do you feel fine?

Michael Calore: I'm guessing you wrote that.

Lauren Goode: I did. That's my favorite karaoke song.

Michael Calore: Yes. I was going to say it's a very Lauren headline.

Lauren Goode: 1990s R.E.M. reference. But yeah, no, I mean it just was so clear to me when I started to see some of the changes that were being made to Google Search in pre-interviews and pre-briefs and then once again at I/O yesterday that things are really changing in Google Search.

Michael Calore: OK, like how?

Lauren Goode: So over the past year, Google has been experimenting with some changes to Search in its labs. It rolled out something called the Search Generative Experience. It was limited to a certain number of users. We as journalists were given access and have been playing with it for a while. But what you started to see more and more were these AI overviews. You would search for something, whether it was where can I see the aurora borealis, or where can I find a yoga studio near me, or what does the word penumbra mean, or anything, lots of shopping too, because people do a lot of shopping on Google, or at least start there.

Michael Calore: Right.

Lauren Goode: And what you would see is a little snippet that was generated by AI, by a version of Gemini specific to Search, that would give you a summary of information before you eventually got to other content. Sometimes that would be ads, sometimes that would be like little chips, sometimes it would be the famous 10 blue links that we've all gotten used to clicking through on the web. That AI overview is now rolling out to everyone in the US in the English language. And Liz Reid, who's the new boss of Google Search, said that by the end of this year they hope to roll it out to more countries, and more than a billion people will be having this search experience. This is a big, fundamental change in how Google Search will work.

Michael Calore: So this Search Generative Experience is something that I opted into. I'm sure a lot of people listening, and everybody in this room, probably opted into it at some point, and I just found it annoying. It was like, OK, it's kind of giving me what I want, but really I just want a link, because forever and ever, as far back as I can remember, I typed something into a Google Search box and it gave me a link. And that part of it, it seems, is really going away, and it's going to be replaced by these summaries. And the summaries were not very good, but they're getting better all the time, and they've gotten very good very recently. Right?

Lauren Goode: I agree with you. They're taking up valuable real estate at the top of Search. Some of that is just habit. We're very used to Search operating a certain way for the past couple of decades. Some of that is, to your point, that maybe the summaries aren't very good or in-depth. One thing worth noting …

Paresh Dave: And take a long time to load.

Lauren Goode: And they do take a long time to load, a few seconds, even though Liz Reid pointed out that time is valuable to people. They want information quickly. If they don't get it, they're going to go somewhere else. But still, it's taking a little while.

But one thing that's definitely important to note is that Google is not going to show this for all search results. One example they gave is if you just want to go to Walmart.com, but you happen to use Google as your front door to getting there, typing in Walmart or Walmart.com, it's just going to take you to Walmart.com. It's not going to give you an AI overview of the Walton family, we hope. But then there are other queries that are more complex or multistep questions, and that is where they're more likely to show the AI overview. I think what this really underscores is that Google holds the keys. Google is going to attempt to make an algorithmically determined decision about what is complex and what is not, and then about what constitutes high-quality or high-value content and what does not, and basically tell you this is how Search works now.

Paresh Dave: Yeah, I'm not convinced yet that overviews are here to stay, but I do think we'll get an experience where the 10 blue links, the list of search results, just disappears completely and looks a lot different, something more like Pinterest, with a bunch of pins floating around the page and a lot more images and visuals. I could see that happening slowly, and this is the march toward that. But I'd say of the two things that I am most excited about for where Search could go, one is the potential for more personalization, because historically, Google's whole thing was that it did not like to personalize search results based on anything more than your location.

If you're searching for Walmart, they try to show you the nearest Walmart or results about the equivalent store in your area. But they're talking more about saving things like your shoe size, so that when you go to search for shoes (there are apparently people who search for shoes every day, even though they don't necessarily buy them every day, which was news to me), it's easier to get the results that you want, and you don't have to constantly be entering your shoe size or adding filters or whatever to your search results.

And I think there's potential to do more of that, and that's kind of exciting. And then the other thing is multilingual search results. We in the US are a little bit elitist and mostly just speak English, but outside of here, a lot of people speak two languages. They constantly switch between those two languages, and these models, like versions of Gemini, can enable these AI overviews to have a mix of, say, Hindi and English, or French and English, whatever it is, and provide people answers more readily in the languages that they actually speak.

But I think the problem with all of this is that it's all limited by what content exists on the web. Gemini generates text, but when people are searching, they're searching for facts, and the only facts it can provide (unless it's hallucinating, and then they're not facts) are what's online, what's on the internet. And in a lot of places around the world, there's not a lot of content. It's what the industry calls low-resource or resource-poor. There's just not a lot of information, and I don't see anything coming out of Google recently about how they're going to solve that.

Michael Calore: Interesting.

Lauren Goode: Right. Google I/O was particularly interesting this year because, per usual, there was the juxtaposition of press with Google executives; it's an opportunity for us to talk to some of them in person and get a sense of what they're thinking about, or how they're thinking about the future. This year, there were a lot of questions about what it's like to be a web publisher in the age of the new Google, and that has an impact for media both large and small, and it potentially has an impact for advertising as well, I think.

Michael Calore: And for readers.

Lauren Goode: And for readers. Absolutely. Two things that I think are going to be big questions as Google decides how to tune AI overviews, to Paresh's point, even if they decide to keep them, are what kind of traffic they end up sending to web publishers and web creators and bloggers and small local news sites and all of this stuff, and whether people will keep clicking at all. Look, there's a lot of spam and chum on the internet too. Not all of the sites out there are high quality. I don't know if you knew that. But yeah, the question is whether, if you give people these AI overviews, they are then curious enough to continue clicking, or are they just going to be satisfied with the AI overview? Liz Reid of Google said that Google does believe that people will then continue to scroll down the page, or have their curiosity piqued and then continue to click around and maybe click on your blue link. I'm not quite sure that that's going to be the discovery process. I'm not super optimistic about that. We will see.

Paresh Dave: Yeah, I think part of their pitch is that right now we mostly go to Wikipedia or these big websites and their hope or what they're feeling is happening, at least in some of these tests, is people are going to smaller websites that have similar information about a topic that you're looking for but maybe buried down in the list of links normally. And because of the way Gemini and these AI overviews work, they're sort of promoting them right alongside Wikipedia, and so you're more likely to visit these smaller websites.

Lauren Goode: Right. In terms of advertising, it could potentially help boost advertising on the web if advertisers are given a certain set of AI tools as well, and it kind of levels everything up. Paresh reported recently on earnings from Meta, and one of the things that Meta noted is that its advertising revenue had gone up partly because of the way AI is helping to boost those ads. And so is there a potential for that to happen on Google? Sure. The way our business model works, of course, is that we also are reliant on advertising, but we're also reliant on Google and other kinds of front doors to the internet sending us traffic.

Michael Calore: Yeah, like you were just saying, you're talking about all of these existential questions for publishers and for the advertising industry and for readers, and Google holds the keys to all of them.

Lauren Goode: Right. Google and Meta.

Michael Calore: So whatever they want to send us, we're just going to have to take it.

Lauren Goode: Yeah, absolutely.

Michael Calore: The thing that was really striking to me, when you think about that, is that yesterday we really got an overview of how Google is thinking about search. Most of us think about search as: you Google something, you type something into a box, you get a result. They're thinking about search through the camera, they're thinking about it through a voice assistant. There's Circle to Search on Android phones, where you can circle something on your screen and it'll search for it visually. There are all these different ways of thinking about search that Google is really putting all of its weight behind, and obviously all of it is AI-powered, and all of it is slowly going to change the way that we use the internet.

Lauren Goode: I don't think Google thinks this is an existential threat, but I do think there's the emergence of tools like OpenAI's ChatGPT and Perplexity, which has gotten a lot of hype, and even Brave. There are going to be more "search engines." I say that in quotes because it's hard to say how capable they will be, but they're built on top of open source AI models, or claiming to be more private versions of these, and they're just getting a lot of attention right now. And I think, even if they just siphon off single-digit percentage points of market share or mind share for Search, Google probably sees that as still like, "That's a lower-quality search experience. We can make our search better. We have this knowledge graph that includes location, that includes real-time information, that includes all these things that maybe these other tools don't have. And so we have this commitment now to just layer that in across all of our products, and that's how we're going to continue being dominant."

Paresh Dave: The wrinkle I'd throw into this is that, in addition to being a frontier model, there's also the app called Gemini. And Google right now is viewing that as sort of another outlet for Search. Gemini is a chatbot experience, right? Very similar to ChatGPT. So Google now has Search, and then it has Gemini, the chatbot experience that's slowly subsuming what was once known as Google Assistant, and still is known as Google Assistant, but wasn't mentioned a single time yesterday, again, for the second straight year, I think.

And it seems like Gemini is on the path to replacing Google Assistant completely. But then, like Mike mentioned, there's visual search. You can do visual search in the Gemini app, or on Gemini.com or whatever it is, and then you can also use Google Lens, which is more akin to a traditional search product. So Google is viewing these as different experiences and thinks that there are separate markets for all of them. But again, it just potentially adds to the customer confusion here, because you could be having a worse experience in one of these services and not realize that there's a better chance to get what you're after using one of the other experiences, which potentially puts people at a disadvantage.

Lauren Goode: Paresh, do you think you're going to be using Pinterest more than Google Search now?

Paresh Dave: No.

Lauren Goode: I think Paresh is a secret Pinterest fan.

Michael Calore: Not so secret anymore. All right, well, this has been a really great conversation, but we do have to wrap it up because we have to make room for recommendations. So let's take a break and we'll come right back.

[Break]

Michael Calore: OK. This is the part of the show where we go around the table and everybody tells us about a thing that they're enjoying that our listeners might also enjoy. Paresh, you get to go first. What's your recommendation?

Paresh Dave: It was something that I was listening to on my way into taping this podcast, which was the podcast called The Pitch. It is similar to Shark Tank where a startup comes in and, in this case, talks to a diverse group of venture capital investors about their company, and the investors then have a discussion about whether to invest. And they all make individual decisions, and then they also tend to have a follow-up about how things went after the meeting, which you don't get on Shark Tank normally. These are very frank discussions. The host is very engaging, and the startups tend to be pretty weird and interesting, at least to me. Recently there was this company that is basically renting out spaces in churches on days that the church doesn't have services, and it's like a huge business for churches—or turning into a huge business for churches. The most recent episode, which I listened to today, was about AI to help with podcast editing and improve podcasts and stuff. So it seemed very on-brand.

Michael Calore: Boone is in the other room taking notes.

Lauren Goode: Are we out of a job yet?

Paresh Dave: No. It's supposed to make our jobs easier, like all AI promises.

Michael Calore: Sure. OK, that's great.

Lauren Goode: That's pretty great.

Michael Calore: The Pitch.

Lauren Goode: Are you as a listener able to invest in any way? Is there a Kickstarter element where you find something really interesting and then you go?

Paresh Dave: Not that I know of, but one of the funny quirks is that the show itself has a fund, and they invest from the show's fund once in a while, but I don't know if they solicit outside investment for that.

Lauren Goode: I'm surprised someone hasn't come up with that yet. Shark Tank meets podcast meets Kickstarter.

Paresh Dave: Well, there's lots of rules around equity crowdfunding and stuff, so I'll leave that to the attorneys.

Michael Calore: Maybe you could bring that idea onto the show and see what they think of it.

Lauren Goode: Maybe. Maybe I'll bring it to Shark Tank and we'll just make it a whole chorus of weird startup pitches.

Michael Calore: That's the world we already live in, Lauren. What is your recommendation, Lauren Goode?

Lauren Goode: Those of you who are longtime listeners of the pod know that I occasionally recommend poetry. A few weeks ago I recommended Ben Lerner. I've recommended the works of Seamus Heaney and of Ada Limón, the US poet laureate, in the past. This week I have another poet who I want to recommend. Her name is Kristin Lueke. She is a Chicana poet and essayist. She's based in New Mexico. She's also a creative strategist, and she works with design studios and that sort of thing. But I've been subscribed to her Substack, which might be on another platform now, is it a beehiiv? Well, it's one of the newsletter platforms. Her newsletter is called The Animal Eats, and she publishes about once a month, so it's not an overwhelming number of newsletters in your inbox. It's not news. It's thoughtful, and yeah, I think she's a really beautiful poet, and I highly recommend checking out Kristin Lueke. That's L-U-E-K-E.

Paresh Dave: It appears to still be on Substack.

Lauren Goode: Oh, OK. Thank you, Paresh. It is Substack.

Michael Calore: You sent me one last week.

Lauren Goode: I did.

Michael Calore: It was very good.

Lauren Goode: Yeah, really good.

Michael Calore: I thoroughly enjoyed it, and I want to thank you in person for sending it to me.

Lauren Goode: You're very welcome. That one really struck me, and I was immediately like, “I need to forward this to Michael.” Yeah.

Michael Calore: Nice.

Lauren Goode: Oh, I'm sorry. I was supposed to ask you, and what's your recommendation?

Michael Calore: I am going to recommend a film that also has a literary connection. It is a 2016 film called Julieta. It's a Spanish movie written and directed by Pedro Almodóvar, and I'm recommending it this week because it is based on three short stories by Alice Munro. Alice Munro was a Nobel laureate. She was a short story writer, and she just passed away a couple of days ago. We're recording this on Wednesday. I think she passed away on Monday. And I love Alice Munro's short stories, and I watched this movie in the theater, and somehow I missed in the opening credits (maybe I was looking at my phone) that it was based on three short stories by Alice Munro. But about halfway through the movie, I had that aha moment: Oh, I know this story. And I thought that maybe it was just something that was familiar, but it turns out that it was actually based on her work. So the three short stories are Chance, Silence, and Soon, and they all appeared in the Alice Munro book Runaway, which came out 20 years ago. And Pedro Almodóvar used them as source material to build the movie, and the movie's excellent. I love pretty much everything Pedro Almodóvar does, and this is like top three Almodóvar movies for me. So yeah, I can highly recommend it: Julieta. And if you like Julieta, go read the book, because the book is also excellent.

Lauren Goode: And where can you stream the movie?

Michael Calore: Good question. Hold please. Let me ask Gemini.

Lauren Goode: We are really just leaning into our English major cred here. Whoever is the person who regularly leaves comments on Apple Podcasts saying we're a bunch of English majors, let me just tell you, you are correct, sir.

Michael Calore: You're absolutely right. So you can watch it on Starz, but nobody has Starz, so you can buy it pretty much everywhere else for $3.59 or $4. So it's about $4 to rent it everywhere you stream.

Lauren Goode: That's great.

Paresh Dave: Did you ask Gemini? Because I asked Gemini and it said, "You can rent or purchase Julieta on various platforms including Amazon Prime Video, Apple TV, Google Play Movies, YouTube, and Vudu."

Lauren Goode: No shout-out to Starz.

Michael Calore: It didn't say anything about Starz.

Paresh Dave: It did not. I also don't know if Google Play Movies even exists anymore.

Michael Calore: It does. It does. If you have Google TV.

Paresh Dave: I thought they turned it into YouTube. I thought they subsumed it into …

Michael Calore: Oh, you know what? Maybe you're right. Maybe you're right.

Lauren Goode: YouTube TV or YouTube Live or whatever that subscription is called. I don't know.

Michael Calore: I will tell you that you can rent movies through Google Play.

Lauren Goode: You can.

Michael Calore: I'm looking at the page right now.

Lauren Goode: OK.

Paresh Dave: But that name isn't used anymore.

Lauren Goode: But that's the App Store.

Michael Calore: Right. It's just called Google Play and it's games, apps, movies, TV, books and kids.

Lauren Goode: Ah, OK, OK. The way that … Oh, that's interesting.

Michael Calore: Yeah.

Lauren Goode: Have you watched the new Anne Hathaway movie yet?

Michael Calore: Not yet, no.

Lauren Goode: It's delightful.

Michael Calore: OK.

Lauren Goode: Yeah.

Michael Calore: Also on my list. Also not based on an Alice Munro short story.

Lauren Goode: No, no, no. I just really enjoyed the conceit of a 40-year-old woman having a relationship with a 24-year-old.

Michael Calore: That's a story for another podcast. Maybe you can recommend it next week. Maybe I can recommend it next week.

Lauren Goode: All right.

Michael Calore: Oh, sorry, I thought you were going to show me the Google TV thing.

Paresh Dave: Oh, yeah, yeah, yeah. Google Play was saying you cannot use the Google Play app to buy movies and TV shows anymore. You have to go to the Google TV app.

Michael Calore: On the web.

Paresh Dave: This is Google. Nothing makes sense.

Michael Calore: On the web, you can still go to play.google.com and buy movies.

Paresh Dave: Nothing makes sense.

Michael Calore: Nothing makes sense.

Lauren Goode: I wonder how many people actually actively go type in the URL, play.google. What is it? Com?

Michael Calore: Yeah.

Lauren Goode: I mean, I guess you have to be a super fan to know that that's the URL.

Michael Calore: Probably. I still type YouTube all the time.

Lauren Goode: Yeah, I mean, same. Yeah, it's the end of Search as we know it, folks.

Michael Calore: And I feel fine. All right, well, that is our show for this week. Paresh, thank you for joining us as always.

Paresh Dave: It was great.

Lauren Goode: Thanks, Paresh.

Michael Calore: We hope to have you back soon. Yeah, thanks for coming, and thank you all for listening. If you have feedback, you can find all of us on Google+. Just check the show notes. You can also leave us reviews on Apple Podcasts. Lauren loves responding to the reviews on the show, so please leave a review. Maybe she'll say your username. Our producer is Boone Ashworth. We will be back with a new show next week, and until then, so long.

[Gadget Lab outro theme music plays]
