Your Photos Aren’t Real

At a splashy media event this week at its headquarters in Mountain View, California, Google announced four new Pixel phones. But the most important stuff unveiled at the Made By Google event wasn’t the hardware itself, but rather all of the generative AI tools packed into the devices.

Most notable are some AI-powered camera features that allow Pixel owners to easily add their own image to a group shot after they’ve taken the photo, or to alter a photo entirely by changing night to day and adding objects that were never really there. It’s an exploration of our limits—how convincingly technology can bring alternate realities to life, and how much of the computer-generated scenery we can tolerate.

This week on Gadget Lab, WIRED senior reviews editor Julian Chokkattu joins the show to talk about Google’s fancy new photo tricks. We also talk about Gemini Live, the latest iteration of the company’s AI-powered voice chatbot. Finally, we ask the unaskable: Is Google Assistant finally dead, or just banished to Google’s attic?

Show Notes

Read more about all the new updates from the Made By Google event, including Google’s Pixel camera updates. Learn how the company is using AI to reshape reality. There are some potentially life-saving new features on the Pixel Watch 3. Also read Reece Rogers’ WIRED story about ChatGPT’s advanced voice mode and Jia Tolentino’s New Yorker story about tweens and Sephora.

Recommendations

Julian recommends folding flip phones. Lauren recommends Colorescience Sunforgettable Total Protection Face Shield Flex SPF 50 sunscreen lotion. Mike recommends the audiobook version of All Fours by Miranda July. (You can listen to it in Spotify Premium.)

Julian Chokkattu can be found on social media @JulianChokkattu. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Ring the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

How to Listen

You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:

If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for Gadget Lab. If you use Android, you can find us in the Google Podcasts app just by tapping here. We’re on Spotify too. And in case you really need it, here’s the RSS feed.

Transcript

Note: This is an automated transcript, which may contain errors.

Michael Calore: Lauren.

Lauren Goode: Mike.

Michael Calore: How do you feel about the little voice assistant that lives inside your phone? Do you use it?

Lauren Goode: You mean Siri? Are we still talking about Siri? It’s been 14 years at this point that we’ve been talking about Siri. 13 years.

Michael Calore: Whichever one.

Lauren Goode: Yeah. No. I do, but not often.

Michael Calore: Yeah. What do you use it for?

Lauren Goode: I’ll use it when I’m driving and I need some kind of response. I use a Google Assistant device at night to play soothing sounds before bed.

Michael Calore: Oh, that’s nice.

Lauren Goode: Cooking. Very helpful for cooking. I think Siri now does multiple timers. Is that true? Maybe.

Michael Calore: Probably. It seems like a feature.

Lauren Goode: We live in the future. How do you feel about them?

Michael Calore: Whenever anyone says “Google,” which in my job is a lot, my phone lights up and thinks that I’m talking to it, so that’s always fun.

Lauren Goode: What we’re both saying is they’re hyperintelligent.

Michael Calore: They are.

Lauren Goode: They’re useful.

Michael Calore: They love to listen.

Lauren Goode: They love to listen.

Michael Calore: Yes. But they are evolving in all kinds of new ways, which we saw this week.

Lauren Goode: We should talk about it.

Michael Calore: We absolutely should.

Lauren Goode: Let’s do it.

[Gadget Lab intro theme music plays]

Michael Calore: Hi, everyone. Welcome to Gadget Lab. I’m Michael Calore, WIRED’s director of consumer tech and culture.

Lauren Goode: I’m Lauren Goode. I’m a senior writer at WIRED.

Michael Calore: We’re also joined this week once again by WIRED senior reviews editor Julian Chokkattu. Welcome back to the show, Julian, in the flesh.

Julian Chokkattu: Thank you very much for having me.

Michael Calore: It’s good to have you here.

Lauren Goode: Julian is in studio. Also, for those of you who can’t see, Julian is wearing the coolest bomber jacket.

Julian Chokkattu: Thank you very much.

Lauren Goode: Plaid patchwork purple. A lot going on.

Julian Chokkattu: It was made by a musician who created a fashion label. I’m blanking on his name.

Michael Calore: It’ll come to you.

Lauren Goode: We can link to it in the show notes maybe.

Michael Calore: This week Google showed off all of its new Pixel devices. It held a big media event at the company’s headquarters in Mountain View, California. Julian, you were there. Lauren and I watched it on YouTube from the comfort of a couch in the office. There are four new Pixel phones, two new smartwatches, some earbuds, and while new hardware is always exciting, the hardware was not the most important part of the presentation. Google also showed us a bunch of new AI software features that come with the new Pixel phones, like for example, a voice assistant that can hold a normal conversation in a way that’s actually impressive.

We also saw some AI tools that can manipulate your photos by removing elements that you don’t want in the frame, or weirdly, adding in people or objects that weren’t there in the first place. In the second half of the show, we’re going to talk about how these new announcements advance Google’s position in the AI arms race or not, but first we’re going to dive into these new AI features.

Julian, you’ve demoed all of this new Google stuff and you were there in person. I know that some of it is stuff that you can’t talk about until you write your review of the phones, which is coming up shortly, but I’m sure there’s plenty that you can talk about. Where should we start?

Julian Chokkattu: Yeah. I guess as you say, the AI features are more or less what everyone is going to be talking about a lot, because the hardware has been polished to a degree where they’re all pretty great. Phones are generally great at all the things that you want them to do.

It’s not like before where we were waiting with bated breath on how much better the camera is, or maybe that was just me. But it’s a lot of these features and that’s a lot of the conversations that I had with Google. A lot of the stories that I wrote revolved around the camera experiences outside of capture, like post capture.

The things that you’re doing to your images. For example, Reimagine is this new feature where you can basically type text into a prompt after you take a picture, and then it’ll generate something out of nowhere into your photo. For example, I’ve been doing UFOs a lot for some reason.

Michael Calore: Flying saucers?

Julian Chokkattu: Yeah. It’s just my go-to prompt for some reason. I take a selfie of myself, choose the area that I want the UFO to appear, I type in UFO, and it comes up with a couple of selections, and there you go. Now I can prove that they do exist. They’re here on earth, but you can do anything.

You can change the sky from being cloudy to sunny. You can remove something from the picture and replace it with another object maybe. There’s obviously some guardrails to this. For example, they said you can’t change your facial features. Anything you try to do with a human probably will not work at all.

But that’s the way they’re thinking of it, in that they’re giving you that luxury to start altering your photos. Yes, there’s a watermark, so it does indicate that it was edited with generative AI, but I’m sure it’s not that hard to remove it or figure out a way around that.

Lauren Goode: There was also a feature Google showed off on stage. Jimmy Butler from the Miami Heat showed up in person. The Google PMs took a photo and then they took a photo of Jimmy, and then the two photos just merged and it looked like they were all standing together in a photo, even though they weren’t actually. You superimpose a person into your photo.

Julian Chokkattu: Yeah. That’s called Add Me, and honestly, this is the one feature I just was, when I first saw it, I was like, “That’s going to be something that is so useful to everyone.” Because we’ve all been in that situation where you end up with two separate photos where you’re with your family or friend or loved one and you’re standing in front of a building or some notable landmark.

One person takes the photo and someone else swaps out, and now you just have these two photos where one person is still missing from each photo. Unless you ask someone to take a photo of your whole family, which also, there’s a slim chance someone just runs off with your phone.

Lauren Goode: I’m that person who does that, by the way, if I see people trying to take a photo, I always offer.

Julian Chokkattu: That’s nice. That’s nice.

Michael Calore: Me too. Me too.

Lauren Goode: I’m just like—

Julian Chokkattu: People just ask me, because I have a camera usually, and then, “He’s a photographer. Ask him.”

Lauren Goode: They’re like, “He’s wearing a cool jacket. He must be a cool dude.”

Michael Calore: He looks like a nerd. I get that. I get that.

Julian Chokkattu: But Add Me basically does let you superimpose yourself so that you don’t have those two separate images that you don’t do anything with. You just get one image and it puts the person next to you, and it structures the photo so that you’re aware of where the original photo was.

You’re told to place the second person in the right area and then it just does its AI magic thing and puts a picture of you both as if you were standing together. It’s just a helpful feature in that way, but that’s the sort of thing that Google is giving you that luxury of altering your photos.

They don’t mind it like that, because it’s just you’re blurring the truth a little bit. But it doesn’t matter at the end of the day, because apparently you technically were standing together at some point in time in front of that object.

Lauren Goode: Right. One of the wildest things that a Googler said to you in your pre-briefs, and Julian ended up writing this great story that we’ll link to in the show notes, about photo capture and memories and altering them. This person said that our actual memories from events are fallible, and that sometimes the way we remember them isn’t actually the most truthful, but that if you do the photo like this, if you use the Add Me feature. I want to call it the pick me feature. The Add Me feature. Pick me. Love me. That could be a more accurate representation of the truth, which I found fascinating.

Julian Chokkattu: Yeah. It’s weird, because it’s not untrue. There are sometimes memories where you’re not quite remembering exactly how it was. But to then go back and say, “Well, this is how I want to remember it and I’m going to just change it to that.”

Even though technically that image that you’re looking at was the truth, that was what the scene was. It just maybe doesn’t live up to your expectations of what it truly was in your head. I don’t know. It’s a weird game to play.

Then there’s also the fact that when you’re doing that, you’re kind of cleaning it up into what you want it to be, and so it loses that authenticity, of course. Which is a funny thing, because it’s one thing where they’re adding all these features that break the authenticity of the image technically, but then on the camera side of things, they’re doing all of these things.

One thing: I was talking to the Pixel camera lead, and he said a lot of the changes to their HDR pipeline were made to make the photos look more natural, because that’s what people want. As opposed to, say, 10 years ago, when we were all super into superfiltered Instagram images and you just wanted your camera to directly process the image.

Lauren Goode: Or hypersaturated.

Julian Chokkattu: Yeah. The colors and all that. Now, it’s like, well, people want pictures that look kind of normal and not that. Then now, there’s AI features that kind of also just do wackadoodle. You do whatever you want with your photo.

Michael Calore: Yeah. I feel like I’m going to come out and say that altering your photos is bad and that you should not add people into your photos who were not actually in your photos. I agree with you that I think it is going to be something that people are really going to love for that family photo moment or the group of friends sitting around a dinner table moment. It’s going to be nice for those folks, but photography is about capturing the decisive moment. It’s about taking an image as it exists in the world and not altering it.

Lauren Goode: Yeah. It doesn’t change the physical element or the framing or even the burden of who captures the photo. That still has to happen. There are those memes out there of how moms never end up in photos with their families, because they’re the ones most often taking the photos, and that’s not going to change. You’re just going to shoehorn mom in there now and make it look like the perfect happy photo.

Michael Calore: Right. If this takes off, which to Julian’s point, it probably will. It’s a Pixel exclusive feature right now, so it’s going to exist on, what, 4 percent of smartphones in the world.

Julian Chokkattu: I think that’s generous. Perhaps 0.0 something.

Lauren Goode: But these are the kinds of things that will probably eventually trickle down to other phones.

Julian Chokkattu: Yes. That’s the problem also with some of this stuff is that it’s not about what… Google obviously is trying to keep some of this stuff in mind and they have guardrails for a lot of these features, but eventually these are the features you’ll then see someone is going to make an app that can do the same thing.

They don’t care about any guardrails or watermarking or any of that stuff, and then you can introduce more bad actors to do more nefarious things perhaps and just lower that barrier of entry. That’s the whole point of what Google is trying to do: they’re looking at not the camera and improving the camera necessarily, but the workflow.

There’s so many things currently. A lot of the stuff that they’re adding is stuff anyone who’s a professional can do with something like Photoshop or Lightroom. What they’re seeing it as is, “Well, we want to make it so that anyone can do it. You don’t need that technical know-how.”

You can just do it on your phone, because it’s lowering the barrier of entry. But by doing that, you’re also then, I think it’s inevitable for anything, where now there’s going to be someone that’s going to use it for not the best intentions.

Lauren Goode: Right. Yeah. We should caveat that with the fact that we’re taping this from a Condé Nast building, and Condé Nast owns lots of magazines. In the history of magazines, this has been done at the professional level. Our own Steven Levy here at WIRED was once part of an infamous Vanity Fair shoot of very important people in the tech sphere, and I’m pretty sure that he was photoshopped next to people he had not actually been next to in the shoot.

Michael Calore: Oh, yes. Whenever you see those magazine covers where it’s the 20 hot Hollywood people or whatever, there’s no production assistant in the world who can get all of those people into a room at the same time unless it’s at the Oscars.

Lauren Goode: Absolutely.

Michael Calore: That’s all comped.

Lauren Goode: Right. That’s what we do.

Michael Calore: That’s all shot individually or in groups and then comped together.

Lauren Goode: Maybe the explicit agreement with the reader or the audience in that case is that you’re selling a fantasy, and now we’re all just going to be selling our own little mini fantasies, which we’ve been doing on social media.

Michael Calore: Yeah. To be clear, that is exactly what the Pixel is doing. It’s selling you a fantasy of what your reality actually is. It sounds small, like you were saying, Julian. It sounds like something that is very tidy for some people to use for that one instance, but it’s still a fantasy.

Julian Chokkattu: Yeah. But to your point on the number of people who can actually use it, since it’s only on the Pixel 9 so far. For example, two years ago they introduced Magic Eraser, where you can remove stuff from your photos. Now that’s a staple. OnePlus phones, Samsung phones, all of these phones have it. I think now Apple also has it in iOS as well. It’s just, once someone starts it, it’ll inevitably end up in other phones.

Lauren Goode: Right. Now, to your point about authenticity, we should mention Google is actually doing some pretty tremendous work around skin tones.

Julian Chokkattu: Yes. I had a lovely chat with Florian Koenigsberger, who is Google’s image equity lead. He is the person who spearheaded their Real Tone project, which was to help make skin tones appear more natural in photographs, because it’s kind of been a longtime issue where darker skin tones don’t always appear naturally when you take a photograph, because the equipment and the processing are all tuned to lighter skin tones.

This year there are some notable upgrades, and I kind of did a little photo tour, got to see the little studio in Brooklyn where they do a lot of their testing. They had models in there, different types of lighting conditions, and they’re just constantly taking these pictures, comparing them to old models of phones, and improving it every year.

The most notable upgrade this year was, for example, for a darker-skinned person standing in front of a backlit window. Right now, if you try to take a video, their skin just gets really dark, to where you can’t see their facial features, and then it makes the background look normal.

But now, it prioritizes faces, so it actually understands that, “Hey, let’s preserve the person’s skin tone and their facial features without cranking up the shadows or over brightening the background.” It does a bit more of a balance and you’re not seeing those huge swings in exposure.

It’s little things like that where it’s just addressing those things that a lot of people might not care about or notice, but some people probably do and just kind of ignore it, because that’s what life is. But now, it’s like, “Cool. Well, at least now that doesn’t happen, that I actually can experience a normal photo or have that experience.”

The changes that they’re making in Real Tone also affect lighter-skinned people, too. There are improvements that translate across every type of skin tone because of their explicit focus on darker skin tones. It just elevates the entire thing to make it so that everyone’s skin tone just looks a little more natural.

Michael Calore: Nice. Well, we made it halfway through the show without talking about chatbots, so I think we should all be very proud of ourselves. But let’s take a quick break and when we come back we’ll talk about Gemini Live and generative AI a little bit more.

[Break]

Michael Calore: Julian, setting aside all of the photo stuff that we talked about, the good and the bad, these phones also come with Gemini Live on them. Can you tell us a little bit about what Gemini Live is and who gets to use it?

Julian Chokkattu: Yeah. Gemini Live is this new feature within Gemini. It’s rolling out now to some Android phones, and also you need to be a Gemini Advanced subscriber. Then I think it’s also coming to iOS at some point in the coming weeks. But Gemini is now the default assistant on the Pixel 9. The difference is, basically, last year with the Pixel 8, it was still Google Assistant, and then when you turned it on, it prompted you to switch to Gemini.

Now it’s just Gemini. Google Assistant is tucked away somewhere. We don’t talk about Google Assistant, or at least Google doesn’t talk about Google Assistant. Gemini Live is a realtime conversation. Currently, with Gemini, you ask it something, it takes some time, it spits you back a response.

Live is a normal human conversation where you’re actually asking a question, it then responds to you fairly quickly, and then you can have a back and forth. You can interrupt it. You can change the topic. All the things that you could expect in a normal natural conversation is something that at least that’s what they’re going for with Gemini Live.

It works hands free. What that means is you could turn it on, put your phone in your pocket, walk around and just have this conversation with Gemini without having to stare at your phone. Now, even in their new Pixel Buds Pro 2, you can also do that interaction from the earbuds and it’ll do the same experience.

The idea is that you’ll just have free-flowing conversations with Gemini in your car, on your run. I’m struggling to see how some of that works or how deeply some of these conversations are really happening, because at the end of the day, it’s still limited to… It can learn a lot and understand a lot, because you’re basically asking it anything and it’ll spew the world’s information at you.

But I don’t know. The whole conversation thing. I feel like I’m struggling to see a world where at present and how it works, I’m going to willingly just continue to talk and ask it. I feel like it’s going to be siloed to specific moments, but the way that Google presented it almost was just like, “You’re going to talk to Gemini all the time.”

Michael Calore: Yeah. I think we’re all waiting for that world to materialize where we talk to these things.

Lauren Goode: Wait, wait, wait. I have to ask, what happened to Google Assistant? Has that gone the way of Duo or Wave, or what were the other ones? What was the Plus? Google Plus?

Michael Calore: Still there.

Lauren Goode: Is it?

Michael Calore: It’s locked away in the attic.

Julian Chokkattu: I have some answers for that one.

Lauren Goode: Yeah. Are these just going to merge?

Julian Chokkattu: It’s weird, because they keep saying that there’s some functions that Gemini can’t do that Google Assistant can do, like a lot of smart home stuff. For that, Google Assistant still exists and will be doing that, but over time they keep saying this. Over time Gemini will be able to do all the things that Google Assistant can do, and then no one seems to know what happens after that.

But weirdly, last week when they had the Nest announcement, with the new Nest Learning Thermostat and the new Google TV Streamer, they then said, “Hey, Google Assistant is getting an upgrade with Gemini LLMs.” Now Google Assistant has a different, more natural voice and it has Gemini powering it.

Lauren Goode: Mike’s phone just blew up, by the way.

Michael Calore: It thinks we’re talking about it. Sorry, Google Assistant.

Julian Chokkattu: They were trying to say that in the home you’re going to want to use Google Assistant, and Google Assistant is going to do all of your smart home stuff. Then everywhere else, your personal details, your personal access and Gmail and all that, that’s going to be Gemini.

Lauren Goode: I’m going to bring it back to Condé Nast again. This is like in The Devil Wears Prada, where there are two assistants, and one knows where to get the Hermès scarves from and how to get Steven Meisel on the phone. Then there’s the assistant who’s like, “Just call the restaurant and make the reservation. Call the restaurant with the good reviews and make the reservation.” There’s that level, but eventually they become one superpowered assistant. Is that the idea?

Julian Chokkattu: It’s definitely going to just be one. They just don’t want to say it publicly, I think. They know their history of killing things, I think, and they’re just afraid of saying, “Yep. We’re killing Assistant.” Because also, I do think it does power a lot of smart home stuff, and doing that would break a lot of people’s bought-and-paid-for smart home products.

I think there’s a lot more there that they don’t want to rush, and so there’s going to be this long transitional period. But I do think eventually it’s just going to be Gemini, unless they come up with a new name two years from now.

Lauren Goode: OpenAI just announced, and Google definitely will, there’ll be at least four names for this. OpenAI just released their voice bot in alpha mode, which our colleague Reece has been using. This was something that they previewed a few months ago, and then there was some response to it about how it was flirty and too humanlike.

It sounded like Scarlett Johansson, and Scarlett Johansson sent essentially a cease-and-desist letter, and so they had to slow their roll a little bit. Now it’s out. Apple in June at WWDC also announced Apple Intelligence, which has a few different ways of working, but it beefs up Siri, too, is my understanding.

Michael Calore: Yeah. Makes Siri more real, conversational.

Lauren Goode: I guess the question is how much do we actually want to talk to these assistants? I think some people do. I personally am not really using the voice mode much. Then second, how careful are they being with our data? Because my understanding is Apple, for example, put out in a paper that they, “Do not use our users’ private personal data or user interactions when training our foundation models.”

Google has slivers of that, because they talked yesterday about the Gemini Nano model that is running on devices, on the Pixel device that keeps everything private. But then broadly Google is using things like your queries, your location, your feedback to provide, improve, and develop Google products and services and machine learning technologies.

Then they call it the Gemini apps activity. It is on by default if you’re 18 or older. Users under 18 can choose to turn it on, which is just really interesting backwards wording. It just means if you’re a grownup using Google Gemini, by default your data is being used to train their models.

Julian Chokkattu: I think the big difference this time around about talking to assistants, compared to the old age of Siri. I nearly said Cortana for some reason. But compared to them, I think the main thing here is that they’re fusing assistants with all of their homegrown services.

For example, Gemini Live will soon, in the coming weeks, get access to Gmail, Google Keep, Google Tasks, and Google Calendar. You’ll be able to do more where you can just naturally say, “Hey, what was that invite I got from someone two weeks ago?” Then it’ll just find that email, pull that information, and you won’t have to do anything. You’ll just say, “Yeah. Just put that on my calendar or remind me to buy a gift for them or something.” We’ve never really had a world where we could really do that and it always worked.

Michael Calore: Yeah. I don’t trust it.

Julian Chokkattu: Right. There’s always broken functionality. It didn’t add it properly or doesn’t connect to that service that you use. Apple is going to be doing this, too. That’s their big thing with Apple Intelligence is it’s going to look into your emails and find that information.

It’ll all be personalized to your experience. That’s where I think it’s hard to say how much people are going to talk now, because it was just broken before and kind of haphazard. Now, it might actually work, but again, we’ll see, because I don’t know.

Michael Calore: Well, I think that stuff is interesting. The assistant type stuff is interesting where you can ask it to not just search the web, but search all of your personal information to find things for you when you need them. But why then are all the demos like, “Hey, look at this great conversation you can have like a human to this computer?”

Lauren Goode: Because this is the world’s largest beta experiment for technology. Truly. There was some of that in the 2010s, too, when we all started using apps and then using devices that had GPS built into them, and so our location data was being used for ad targeting and stuff like that. But all of this is ultimately serving, these free products are ultimately serving these tech companies who are making their models better so they can effectively target us better.

Julian Chokkattu: Well, the interesting thing is they’re not really free anymore. Even Gemini Live, now you have to pay for it.

Lauren Goode: Right. Well, they have to find some way to compensate for the incredible compute expense it takes to run generative AI models. Because Google, quarter to quarter we’re very worried it might go under. OpenAI, for example. They’ve been building up their enterprise business, because they have to find some way to bring in revenue.

Julian Chokkattu: Yeah. But at the end of the day, that’s, I think, the most personal. That’s the most useful thing. Rick Osterloh, the person who handles everything hardware at Google, basically started out the keynote yesterday saying, “We want to make AI and show you how AI is going to make everything more helpful to you.”

A lot of those examples were just things I don’t think most normal people do or ask AI. Maybe some people do it here and there, but it’s the stuff where you’re actually using it to connect with your own services and rifle through your information that is already just annoying and hard to access through the apps that you already use.

That’s the stuff that I would love to see more demos of and publicly showing that, because that’s the stuff that I feel like I would use it for that. Because currently, have you ever tried to search for an email in Gmail? It’s bad. It’s a really bad experience. If Gemini can do it much better in an instant way, and I can also then tell it to do something else with it, I would much prefer that than what the current status is.

Lauren Goode: Google has been hammering that helpful line for a long time now. Just saying. Rick, if he wants to be really helpful, can he please figure out my check engine light on my car or something? I don’t know. OK, but helpful. Sure.

Michael Calore: Yeah. I’m looking forward to the day when my phone can do absolutely everything for me without me even asking. It just knows that somebody has asked me if I can join a call and it just adds it to my calendar and tells me about it.

Lauren Goode: It’s a fine line between pure optimization in the workplace and productivity and what is actually helpful IRL. Some of that is. What you’re describing, getting to an appointment, is.

Michael Calore: If you’re willing to pay for it and you’re one of those people who has a new Pixel device, then you too can enjoy this version of the future, Lauren.

Lauren Goode: I think Google said they’re going to give me a loaner, but I didn’t go to the event in person yesterday, so I didn’t get my loaner.

Michael Calore: I’m sure you’ll get one.

Lauren Goode: Yes, I would like to try it. I do love a good Pixel.

Michael Calore: I look forward to talking about all of this again after the iPhone announcement next month.

Lauren Goode: Oh, I thought you were going to say after Julian’s review comes out, which everyone should be looking to read.

Michael Calore: That too. But yeah. I’m sure Apple Intelligence is going to go right down this same highway.

Julian Chokkattu: Although that is technically coming a little later. I think it’s supposed to maybe come in October probably, or November, so not as fresh off the heels of the iPhone launch.

Michael Calore: Yeah. Or the first day of CES, which would be very Apple.

Julian Chokkattu: That would be unfortunate. Very unfortunate.

Michael Calore: Let’s take a break and we’ll come right back with recommendations.

[Break]

Michael Calore: All right, this is the last part of our show where we go around the room and everybody recommends a thing that our listeners might enjoy. It can be anything. I see Julian, I see your wheels turning. Please tell us what is your recommendation?

Julian Chokkattu: I didn’t come as prepared as I feel like I usually am, but I’m going to make something that’s maybe a little too on the nose, but I feel like folding flip phones this year are something that … I met up with my cousin for the first time in a long time, and this is when I was testing Samsung’s new Galaxy Z Flip 6.

Michael Calore: To be clear, this is like a vertical flip, not a book style.

Julian Chokkattu: Not a book style. She’s in high school. I think she’s about to graduate, and we both pulled out our phones and I was using the Samsung, the new one, and she was using the prior generation one. I was like, “What is going on? How are you not on an iPhone?” Because that’s just what most teens in my experience are using.

The fact is, I know several people. My mom loves using a Razr. They’re just getting to a point where I think it’s totally normal, and I’ve seen plenty of these in New York, too. But I think it’s just the quality of folding flip phones has improved to a point where it’s not so much of an issue in terms of durability and making sure the screen is not going to be a problem and all that.

It’s pretty much your normal phone experience, but you can fold it in half and it’ll fit in a pocket. Samsung did increase the price this year, but Motorola has pretty good options for under 700 bucks, for example. I just think people should just explore their options a bit, because there’s some cool stuff happening with phones, and I feel like I’m always writing and reviewing all sorts of phones. But people don’t still necessarily consider the folds, but I think this might be the year where I feel comfortable, the masses maybe look into it. I don’t know.

Michael Calore: Really? Really?

Julian Chokkattu: I especially love the fact that you don’t need to use a feature like Add Me if you’re in front of the Eiffel Tower and you want to take a picture, because you can just flip the screen, use the preview on the external display, and you can just use your hand or say take the photo or whatever, and it’ll take the picture of you and your partner or whatever in front of the landmark.

Michael Calore: OK. There’s one argument for.

Julian Chokkattu: Just the one.

Michael Calore: This is chaos. I understand the appeal of a book-style folding phone, because then when you unfold it, you get something that is the size of a tablet, basically. There’s a lot you can do with that. But the flip style, the clamshell style, the Razr-style flip phones I don’t get, because when they’re folded, they’re just crazy thick.

Julian Chokkattu: They’re not too thick nowadays. Also, I think they’ve been trying to slim them down, but they still take up way less room in, for example, my pant pocket. Just generally, in my experience, even walking around with it in your hand, it’s so much nicer, because you can completely wrap your palm around it when it’s closed.

Then they’ve been making these external screens larger and larger every year, so you can do more on that screen so it doesn’t feel like you have to always open the phone up. You can look at your messages, you can look at some apps, even widgets and all that kind of stuff.

There’s more utility in that mode, which makes it fine to keep it closed more often. Then it’s just kind of nice. You can even access Gemini, for example, on the Razr on the external screen. You don’t need to open the phone up anymore.

Lauren Goode: I think there’s one barrier for me. It’s going to sound real basic, but stretch pants. I buy stretch pants with pockets and I use them for exercise or just to go run errands. The flat phone, what are we calling it now? Just the standard.

Michael Calore: Normy. Normal phone.

Lauren Goode: The normy phone.

Julian Chokkattu: The analog phone.

Lauren Goode: It fits pretty well into those side pockets. The flip phone when it’s folded is chunky, and then what, you’re going to put it in your side pocket? It’s just going to be sticking out from your leg like a lump.

Michael Calore: I guess what he’s saying is they’re getting less lumpy.

Julian Chokkattu: More experimentation required, I think, for particular types of pants. I do have one pair of pants that I hate, but I have them and they have abnormally tiny pockets, and the flip does fit pretty well compared to actually my normal phone, because it’s just such a shallow pocket, I guess. But it does work for that one, but obviously I’m not so used to no pockets or small pockets.

Michael Calore: Right. OK. Flip phones for the shallow-pocketed gentleman, and for everybody.

Lauren Goode: Jason Kehe came on the show months back and made an argument for flip phones, because he said that his was so cumbersome to use that it kept him away from his phone. It created a necessary friction or barrier that he liked.

Michael Calore: Yeah. I don’t know. I don’t know.

Lauren Goode: That guy.

Michael Calore: This whole conversation is just making me sweaty and anxious and I feel like we should move on, so let’s do that. Lauren, what is your recommendation?

Lauren Goode: This is my dive into beauty influencing. This is my backup career when this whole tech reporting thing doesn’t work out any longer. I have found a tinted mineral sunscreen that I love. It’s called Colorescience Face Shield Flex. I have to warn everyone, it’s expensive.

Michael Calore: How expensive is expensive?

Lauren Goode: It’s around $50 a bottle. If you’re not a person who’s into beauty products and doesn’t spend a lot of time going down the K-beauty rabbit hole on TikTok or Reddit threads or YouTube get-ready-with-me videos, that’s sticker shock. But if you follow this world, you understand that people spend a lot of money on these beauty products.

Also, I will say it lasts a pretty long time. I think I’ve only had to buy two bottles of this so far this year. Now, I know people also have strong opinions on European sunscreens and how they’re typically better for you. Also, there are people who love their Korean sunscreens, and I have not had the opportunity yet to dive into those, but so far I do love the Colorescience one. It’s a tint. It comes in four shades. I can’t vouch for how well it works in the shade that’s called deep. That’s for darker skin tones. But so far I have found that the fair and medium work well for me in different seasons. If you’re looking for a new sunscreen and you’re looking for something that has a tint to it, not as thick as foundation, but not just sheer or pasty, then I recommend checking out Colorescience.

Julian Chokkattu: I didn’t even know you could get sunscreen that was tinted like that.

Lauren Goode: You can, as a matter of fact.

Julian Chokkattu: Good to know.

Michael Calore: You got to ask a teen, man. If you want to know anything at all about skincare, just ask a teen.

Lauren Goode: It’s so wild. Jia Tolentino for the New Yorker just did a big story the other day on Sephora.

Michael Calore: Oh, yeah. The teens.

Lauren Goode: All the teens and the tweens love Sephora. The days of going to a prohibitively expensive department store counter and having a person guide you toward the potion? Done. You just go into Sephora. There are people there to help you, of course, but you just go from station to station, you try all the things, glop it on your face. The teens, they know what they’re doing. It’s a bit much. It’s a lot of consumerism. They probably don’t need it.

Michael Calore: No, certainly not. But everybody needs this tinted sunscreen. This $50 tinted sunscreen.

Lauren Goode: But sunscreen is good. Wear sunscreen, people, and if you happen to be looking for a tinted one, Colorescience.

Michael Calore: Hit us with the impossibly long name again.

Lauren Goode: It is called, hold, please, the Colorescience Face Shield Flex SPF 50.

Michael Calore: OK. Excellent. 50 is for the $50 that it costs?

Lauren Goode: Yeah. Not including tax and shipping. Mike, what’s yours?

Michael Calore: I’m going to recommend an audiobook. All Fours by Miranda July. Now, this book is very, very popular. It’s literally a sensation. Out of stock at all the bookstores. There’s a 16-week wait at my local library for the Libby version of it.

However, if you are a Spotify Premium subscriber, you can listen to the audiobook as part of your Spotify Premium subscription, and it is read by the author Miranda July. All Fours is a story about a woman who’s in her mid-forties who is vaguely dissatisfied with her life and decides to go on a road trip, which takes some really fun and interesting turns.

It’s a great book. It’s a lot of fun. I’m almost done with it. I’ve been devouring it on Spotify as an audiobook, so I can definitely recommend it. If you are curious about this book, I’m sure you’ve heard of it if you’re a person who follows BookTok or reads any kind of mainstream media, because it’s just been getting all kinds of rave reviews and it’s probably going to be the biggest seller of the year or something.

Lauren Goode: I tried to buy it at a bookstore here in San Francisco, and they were out of it. They were so nice and they ordered it for me. But our friend Zoe shipped me a copy in the meantime, so I’ve got it at home. I just have to finish this other book first, because I’m big on finishing the thing that you started with books, for the most part. I haven’t cracked it open yet, but I’m so excited.

Michael Calore: When I’m doing audiobooks, I don’t listen to any podcasts so that when I’m done with an audiobook, I get to go back and catch up on all my podcast listening.

Lauren Goode: Yeah. There’s something really satisfying about the Goodreads thing, too. Check me off. I did that. Updating your progress.

Michael Calore: See, it’s hilarious that we’re talking about reading books and it’s all like, “Oh, but you can get it on Spotify. Oh, you can get it as an ebook. Oh, you can check it off on Goodreads.”

Lauren Goode: Do you mark it off on Goodreads when you’ve listened to an audiobook?

Michael Calore: Yeah, of course. It’s reading. I don’t care what anybody says, it’s reading. I read it.

Lauren Goode: I think that if your brain processes the auditory version of it the way that you would reading it, then that counts. My brain doesn’t. I like to read and not listen to them for the most part.

Michael Calore: OK. Well, then this recommendation is not for you.

Lauren Goode: OK. That’s fine.

Michael Calore: Julian, I highly recommend All Fours.

Lauren Goode: But I have a hard copy.

Julian Chokkattu: I’ll either find it physically or find it on an audiobook.

Michael Calore: OK. Great.

Julian Chokkattu: I’m sadly not a Spotify Premium subscriber.

Michael Calore: Oh, OK. Well, more power to you for that, I guess. Thank you for joining us this week. Thanks for being here.

Julian Chokkattu: Thank you for having me.

Lauren Goode: That was awesome.

Julian Chokkattu: It’s always fun.

Michael Calore: Yeah, of course. Thanks to all of you for listening. If you have feedback, you can find all of us inside Gemini. Just check the show notes or ask your phone. Our producer is Boone Ashworth. We’ll be back next week with a new show, and until then, goodbye.

[Gadget Lab outro theme music plays]