For all the complaints about Google Pixel phones, like their prices, specs that trail other flagships, or design quirks, there’s still one area that leaves me scratching my head with every new year that passes. Google Pixel phones, including the Pixel 4, still don’t have a manual or pro camera mode. It’s so weird.
When Google announces a new Pixel phone, it’s guaranteed to spend a big chunk of time talking about the phone’s camera setup. This year, with the Pixel 4, they spent roughly 20 minutes of an hour-long presentation talking about the camera. Marc Levoy, the computational photography guru behind the Pixel camera, talked at length about all of the new tricks up the Pixel 4 camera’s sleeves before handing the stage off to Annie Leibovitz. The camera is what sells the Pixel phone line.
While Google added real-time dual exposure sliders for brightness and shadows, Live HDR+, and improved auto white balancing, only the first of those is really a manual control. The rest of the Google Pixel camera experience is left up to Google and its computational photography sorcery. Sure, Google’s camera smarts are arguably the best in the business, but what if you don’t always want Google to be in full control? What if Google gets it wrong?
For example, white balance can be a tough area to get right, specifically in an auto mode. The Pixel 4 introduces an improvement here that was called out on stage, but again, you are trusting Google to get it right. In the example that Levoy showed during his presentation, he talked about how snow may appear blue because of the sky above it, while we know that snow is white. The Pixel 4’s computational photography should learn to correct that, but wouldn’t a manual white balance control bring more certainty to those who want it?
You’ve probably noticed that whenever Tim reviews a phone that isn’t a Pixel, he often talks about how much fun he’s had shooting long exposure photography. He never brings it up with Pixel phones, because they lack the shutter speed control needed to do it. You could approximate a long exposure shot using Night Sight, but that ability is limited to darker times of the day.
I’m not a pro photographer and mostly shoot in auto, but even I’m a fan of pulling focus manually from time to time. Tap-to-focus is great in the majority of situations, until you really want to lock in on a subject, an edge, a flower, etc. Tapping and hoping that the camera responds to the precise spot you need in focus can be frustrating. An on-screen manual focus wheel with focus peaking, like Samsung’s pro mode offers, is a control that’s hard to beat.
Google’s phones can’t do any of that in the stock camera app. Of course, you could install a third-party camera app, which, if I’m being honest, only introduces more uncertainty. We tend to trust the default camera app on a phone as the best tuned, because it comes from the company that built the camera into the phone and prepared it for launch. It should give us the best results in the end.
And remember, Samsung, OnePlus, and LG all have manual camera modes. Huawei does too. So does Motorola. Google is about the only company on Android not doing it.
Why is that? It’s probably like their 4K/60fps mindset, where they simply think you don’t need a pro or manual mode. Their computational photography is so good that you don’t need to manually control anything, because Google will get you better results. Maybe in most situations they will, but let’s not forget that Google is big on being a camera leader and often enlists pro photographers to show off its cameras. You can’t tell me those pro photographers wouldn’t love to see how good Google’s cameras could be if they had that full control.