For Google’s Pixel Camera Team, It’s All About the Memories

“Did Real Tone get worse?” That was one of my criticisms of the Pixel 8A when I reviewed it a few months ago. Google’s midrange smartphone packs a wallop for $500, but Real Tone, the camera feature designed to produce more accurate skin tones, especially for people of color, was a bit of a mixed bag. My siblings and parents said as much when we compared selfies to competing phones in the same price bracket.

Florian Koenigsberger and his team have been acting on this kind of feedback to continually tweak image processing algorithms ever since Real Tone debuted in the Pixel 6 three years ago. He’s Google’s image equity lead and the person who spearheaded the project. I got a chance to peek behind the studio curtain to see the testing Google does to refine these imaging algorithms every year—Real Tone in the latest Pixel 9 series supposedly offers some of the most notable improvements across the board since its launch.

Properly exposing darker skin tones was not a priority for film camera systems in the mid-to-late 20th century. Kodak's Shirley Card, which featured a photo of a white woman, was the tool used to calibrate colors for printing images across the US. While many of these issues have been addressed in the digital era, the problem is far from resolved. It's still not hard to find criticism of how black skin tones appear in media, from magazine covers to feature films, which are regularly called out for lightening skin. And remember when webcams on HP computers failed to track the facial features of a black man but had no trouble with a white woman?

It's why a feature like Real Tone hits home for Koenigsberger. “My dad is a white German, my mom is a very dark-skinned black Jamaican woman, and my brother and I are two different colors,” he says. “Trying to get that family photo has historically not been the easiest on any device, and I feel like every year I actually get to live the truth at home.”

Small Spaces

We met at a small photo studio on the second floor of a building in Brooklyn near McCarren Park. It's not the kind of space you'd expect to be bankrolled by one of the wealthiest companies on Earth. The studio is cramped, with seamless paper backdrops against one wall and a mountain of photo equipment at the other end of the room. Koenigsberger says his team put it together during the pandemic in 2020 and has held onto it since.

The smart lights in the studio are voice-controlled, and I spotted a Google Wi-Fi router in a corner—the team issues commands to a Nest Audio smart speaker to cycle through the varied lighting conditions it uses in testing. Darkening the space to simulate nighttime settings is a bit more manual: Koenigsberger climbs a step ladder to mount blackout curtains over the windows.

This setup lets the team take photos of subjects and use that data to improve the Real Tone algorithm baked into Google's Pixel phones in the hopes of delivering a more accurate representation of skin in every photo captured. And yes, while the feature launched to better serve darker skin tones, it has also helped improve the rendering of lighter skin tones in varied lighting.


Koenigsberger says his inspiration came directly from Aperture magazine's Vision & Justice issue in 2016, guest edited by Sarah Lewis, who teaches history of art and architecture and African and African American studies at Harvard University. The issue covers the contributions of African Americans to the histories of film and photography.

“In all of my education, I feel like I had never somehow been told or seen who these photographers were, what their contributions were, and seeing my community represented in the way that their eyes did,” he says. “And so it begged this larger question: With my time at Google, what is something that only I can do while I'm here? What is the company well positioned to do with its expertise in computational photography, and where could we really look at, like, a flip-the-switch moment for a problem that has existed for decades and decades and decades.”

Also in the studio was Cootchill Nelson, Google's image quality engineer and the photographer who leads many of these shoots, along with several models of varying ethnicities and skin tones. Nelson captures photos of these models with different colored backdrops and various lighting, and they often walk around the neighborhood to photograph scenes outside. This particular setup involved a Pixel 8 Pro and a Pixel 9 Pro mounted onto a single rig to capture the same photos and videos side by side. It's not unlike the way I test smartphone cameras in reviews.

But Google has a poor record when it comes to gathering this kind of data. Notably, after the launch of the Pixel 4 and its new face-scanning feature, the company landed in hot water when a third-party contractor was caught targeting homeless people with darker skin for 3D facial scans, offering $5 gift cards and rushing them through consent forms without fully explaining what the data would be used for. Thankfully, that's not a practice in Koenigsberger’s studio.

“Everybody is a paid and notified, willing participant,” he says. “We work with a couple of different agencies to do model recruitment and make sure we’re doing at or above market rate for that, and we've tried to work with people over time as well, mostly because it's really impactful for the people who participate in the project to see the changes in the technology.”

Big Changes

Speaking of which, I saw firsthand many of the Real Tone improvements in the new Pixel 9 series compared to its predecessor, both in the studio and on our photo walk around the neighborhood with Koenigsberger and his team. One of the most notable shows up when photographing a backlit subject in front of a window: The Real Tone algorithm now prioritizes faces in photos and video, reducing huge swings in exposure, like when a dark-skinned person is in a bright setting.

Take the video example below. The Pixel 8 Pro tries to adjust the exposure for the whole scene, and it swings wildly, making the model appear extremely dark as she pulls away from the window and brightening her skin as she gets close. On the Pixel 9 Pro, there are still some exposure adjustments happening, but the shift isn't as dramatic, and the model's facial features remain visible.
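To get a feel for what face-prioritized exposure involves, here's a minimal Python sketch of the general idea: meter the face more heavily than the rest of the frame, then smooth the result over time so the exposure doesn't lurch from frame to frame. It's an illustration of the concept only, not Google's implementation; the face_weighted_exposure helper, its weights, and the normalized-luminance input are all assumptions.

```python
# Illustrative sketch of face-prioritized auto-exposure, not Google's code.
# Assumes a per-frame luminance map normalized to [0, 1] and a face bounding
# box supplied by some upstream face detector.
import numpy as np

def face_weighted_exposure(frame_luma: np.ndarray, face_box: tuple, prev_ev: float,
                           face_weight: float = 0.8, smoothing: float = 0.9,
                           target_luma: float = 0.45) -> float:
    """Return a smoothed exposure value (EV) that favors the face region."""
    x0, y0, x1, y1 = face_box
    face_mean = frame_luma[y0:y1, x0:x1].mean()   # brightness of the face region
    scene_mean = frame_luma.mean()                # brightness of the whole frame
    # Blend face and scene brightness, weighting the face heavily.
    metered = face_weight * face_mean + (1 - face_weight) * scene_mean
    # Exposure correction needed to bring the metered value to the target.
    target_ev = prev_ev + float(np.log2(target_luma / max(metered, 1e-6)))
    # Temporal smoothing keeps the exposure from swinging between frames.
    return smoothing * prev_ev + (1 - smoothing) * target_ev
```

In a real camera pipeline, a value like this would be translated into sensor exposure time and gain, and the face detection, weighting, and smoothing would all be considerably more sophisticated.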


Another clear example came when a model lifted a surgical mask to her face—her skin darkens noticeably in the Pixel 8 Pro video, whereas there isn't as much of a swing on the Pixel 9 Pro. The same processing helps when multiple people with different skin tones share the frame; Koenigsberger says there should be fewer distortions in exposure.

Analyzing photos captured at the scene, it wasn't hard to discern the updates to the algorithm, especially with the luxury of having the models right in front of me. Even in normal lighting conditions, skin tones on the Pixel 9 Pro were, to my eyes, a much closer match to the people in real life than on the Pixel 8 Pro. Koenigsberger says these improvements are also due to broad changes in Google's HDR+ imaging pipeline (more on this later), which enable the system to produce more accurate shadows and midtones.

Also new is auto-white balance segmentation, a process that computes separate white balance adjustments for the people in a picture and for their background. Before, you may have noticed color seeping in from the setting, like a blue sky lending a cooler cast to skin. The new system helps “people stay the way that they should look, separate from the background,” Koenigsberger says.
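One rough way to picture white balance segmentation: estimate white balance gains from the subject's pixels and the background's pixels separately, then apply each set to its own region. The sketch below uses a simple gray-world estimate and a hypothetical segmented_awb function; it's not Google's algorithm, which would rely on a far better illuminant estimate and blend the mask boundary to avoid visible seams.

```python
# Illustrative sketch of per-region auto-white balance, not Google's code.
# Assumes an 8-bit RGB image and a person-segmentation mask computed elsewhere.
import numpy as np

def gray_world_gains(pixels: np.ndarray) -> np.ndarray:
    """Simple gray-world estimate: scale channels so their means match."""
    means = pixels.reshape(-1, 3).mean(axis=0)
    return means.mean() / np.maximum(means, 1e-6)

def segmented_awb(image: np.ndarray, person_mask: np.ndarray) -> np.ndarray:
    """Apply separate white balance gains to the subject and the background."""
    out = image.astype(np.float32)
    subject = person_mask.astype(bool)
    out[subject] *= gray_world_gains(out[subject])    # skin no longer picks up the sky's cast
    out[~subject] *= gray_world_gains(out[~subject])  # background balanced on its own
    return np.clip(out, 0, 255).astype(image.dtype)
```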

This year's Pixel 9 series is also the first time Google's skin tone classifier fully aligns with the Monk Skin Tone Scale, a publicly released 10-shade scale that represents a broad range of skin tones and is meant to help with everything from computational photography to health care. Koenigsberger says this change allows for much more fine-tuned color adjustments.
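As a rough illustration of what aligning a skin tone classifier to a 10-shade scale means, the sketch below snaps a sampled skin color to the nearest of 10 swatches. The swatch values here are placeholders, not the official Monk Skin Tone Scale colors, and a real classifier would compare colors in a perceptual space such as CIELAB rather than raw RGB.

```python
# Illustrative sketch of nearest-swatch skin tone classification, not Google's code.
import numpy as np

# Ten evenly spaced placeholder shades from light to dark (RGB). These are NOT
# the official Monk Skin Tone Scale values, which are published separately.
ILLUSTRATIVE_SWATCHES = np.linspace([246, 237, 228], [41, 36, 32], num=10)

def classify_skin_tone(skin_rgb) -> int:
    """Return the 1-10 index of the swatch closest to the sampled skin color."""
    diffs = ILLUSTRATIVE_SWATCHES - np.asarray(skin_rgb, dtype=float)
    return int(np.linalg.norm(diffs, axis=1).argmin()) + 1

print(classify_skin_tone([120, 90, 70]))  # prints the 1-10 index of the nearest placeholder swatch
```

Finer-grained buckets like these give the tuning team more targets to adjust color against, which is what the fine-tuned adjustments Koenigsberger describes depend on.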

Arguably most important: For the first time, Real Tone has been tested against all of Google's “Hero” features across the Pixel 9 range. Koenigsberger says his team has been able to scale up testing to ensure new features like Add Me are vetted for Real Tone before launch. That matters because the team isn't always able to spend as much time testing on the A-series Pixel phones, which might be why I had some issues with Real Tone on the Pixel 8A. Scaling this process up will hopefully help, and Koenigsberger says it moves Real Tone from a specific set of technologies toward being part of Google's operating philosophy.

“Ultimately, this is going to be someone's memory,” Koenigsberger says. “It's going to be their experience with their family; it's going to be that trip with their best friend—as close as we can get to recreating those experiences when we're testing, I think the more reliably we're going to get people something that they're happy with.”

Artificial Memories

Memories are the underlying theme driving many of the new features from Google's camera team. Earlier in the day, I sat down with Isaac Reynolds, the group product manager for the Pixel Camera, a team he's been part of since the launch of the first Pixel phone in 2015. Nearing his 10th anniversary, Reynolds says he's probably “more enthused than many others” about mobile photography, believing there's still plenty of room to advance cameras. “I see the memories people can't capture because of technical limitations.”

New camera features in Pixel phones increasingly focus on specific scenarios rather than broad-strokes changes to the general camera experience, though Reynolds says the HDR+ pipeline has been rebuilt in the Pixel 9 series. It retunes exposure, sharpening, contrast, and shadow merging, which—along with all of the updates to Real Tone—helps create a more “authentic” and “natural” image, according to Reynolds. He suggests that's what people prefer now, compared to the more processed, punchy, and heavily filtered images that were so popular a decade ago.


All of those updates will impact your day-to-day photo and video needs, but what about the special moments? The hike up that mountain in Australia that you saved up hundreds of dollars for? That once-in-a-lifetime family photo at the Grand Canyon? That's where some of these new features come in.

Add Me, for example, lets you take a photo of a loved one in front of a subject, like the Eiffel Tower, then swap places so they can take a picture of you doing the same—the camera superimposes the two images to make it look like you were standing next to each other all along. No need to risk handing your phone to a stranger. The entire Panorama mode in the Pixel 9 has been rebuilt too. It's not something you'll use often, but when you do, you're probably standing at a vista worth remembering. That's why Reynolds says the team focused on bringing it up to speed with more recent camera advancements, utilizing the entire HDR+ Night Sight image processing algorithm for low-light improvements and better stitching quality.
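Conceptually, the compositing step behind a feature like Add Me is easy to picture: cut the person out of the second hand-off shot and paste them into the first. The sketch below assumes the two frames are already aligned and that a person-segmentation mask is available; the real feature also guides you to line up the second shot and blends the result far more carefully than this.

```python
# Illustrative sketch of compositing two hand-off shots, not Google's Add Me code.
# Assumes frame_a and frame_b are aligned RGB arrays of the same shape, and
# person_mask_b marks the person who appears only in frame_b.
import numpy as np

def composite_second_person(frame_a: np.ndarray, frame_b: np.ndarray,
                            person_mask_b: np.ndarray) -> np.ndarray:
    """Paste the person segmented out of frame_b into frame_a."""
    out = frame_a.copy()
    mask = person_mask_b.astype(bool)
    out[mask] = frame_b[mask]  # copy the second person's pixels into the first shot
    return out
```

A production version would align the two frames first (for example, with feature matching and a homography), feather the mask edges, and match exposure and color between the shots before blending.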

Much of this work targets the “workflow” rather than the capture process itself. Reynolds points out that, currently, creative professionals have a workflow where they plan a shoot, set up cameras, capture the subject, then move the media to an editing platform, sort through the photos, and edit the images. The camera is a very small part of that.

“We are not just inserting ourselves into this narrow slot built for a camera,” Reynolds says. “I would like to insert ourselves into this broader, longer sequence of creation people are increasingly going through, because that's the problem to solve.”

That brings us to the generative artificial intelligence features increasingly making their way into Google's Pixel phones. Last year, the company debuted Magic Editor, which lets you move subjects in the frame from one end of the photo to the other, and AI will fill in the area where the subject is no longer present by generating pixels. This year there are even more GenAI features, most notably one called Reimagine.

After you snap a photo, you can hop into Magic Editor and “Reimagine” your image. This lets you enter a text prompt to alter the image. Want to add a UFO in the sky? Just type it in and, after a minute, you'll get some options to choose from. Want to change the sky from day to night? Go ahead. The quality of the results can vary (and depends on how descriptive the prompt is), but it's opening the floodgates to change the photograph completely.


To Reynolds and the broader Pixel Camera team, it's not necessarily the photo that's important but your memory of the moment. If you want to fiddle with the image by turning the sky into a sunset, or use the new Autoframe feature to recompose the shot so it's more striking, that's OK.

“It's about what you're remembering,” he says. “When you define a memory that way, there is a fallibility to it: You could have a true and perfect representation of a moment that felt completely fake and completely wrong. What some of these edits do is help you create the moment that is the way you remember it, that's authentic to your memory and to the greater context, but maybe isn't authentic to a particular millisecond.”

Two years ago, I interviewed Ramesh Raskar, an associate professor at the MIT Media Lab, for a video on the advancements in smartphone photography, and he talked about how one day we’ll have camera phones that can change a cloudy day in Paris into a sunny one. That day is pretty much here. I reached back out to Raskar, who happened to be in Paris for the Olympics, and he remarked that he has barely seen anyone in the crowds with a professional camera (barring sports photographers, of course).

“One could argue that the best photos are the ones that don't capture what we see with our eyes,” Raskar says. “The photos that we like are extreme zoom, very shallow depth of field, macro photos, capturing motion in bizarre ways, light trails. Even in the world of photography we have moved away from photos that would have been seen by a natural eye. That evolution will continue, and photography will be less and less about what the eye can see.”

Raskar’s new prediction? A “non-camera.” No sensor, no lens, no flash, just a single button. All it captures is your GPS location and the time of day. You could say, “Hey, I’m with my family and friends” in front of the Eiffel Tower, and it would compose the picture, since it knows the time, date, location, and weather.

As a photographer, that’s a terrifying future to imagine. If it’s any consolation, Reynolds doesn’t think photography is dead.

“I don’t think anything we have today is going anywhere. I do think we are going to make a lot of things a lot easier for folks, and that’s going to be fantastic,” he says. “’Cause then they can spend that time doing other things, making more memories, instead of fighting to create the memories they thought they already had, or ones that didn’t match up to how they remembered them, leaving them frustrated and sad. I think we’re going to have a lot more happy folks making memories more easily.”
