It’s time for Apple to re-focus on the iPhone camera
Camera quality has been a top priority in every iPhone release since the iPhone 4 (at least). Each new iPhone has brought big improvements in image quality, performance, and features. The latest iPhone seems to always take the crown in our Last Cam Standing series, at least for a time. And while Android phones sometimes deliver slightly better photos or videos, you can count on Apple’s latest to always hold a position in the top three.
Maybe that’s not enough anymore. Camera quality is the most important consideration for smartphone buyers (perhaps outside of battery life), and it’s the easiest feature to show off to your friends. It’s a real phone-seller. Today, it’s not enough to simply take really great photos and videos. Apple needs to “wow” us with brand-new features and a completely rethought camera interface.
Make room for Pros
The current iPhone camera interface is positively ancient. It works well enough for the simple “take the phone out of your pocket and grab a quick snap” scenario, but it has struggled to keep pace with the growing demands we put on our increasingly capable iPhone cameras.
Apple’s basic interface—a large shutter button, swipe left or right for different modes, additional selections at the top—is simple and intuitive enough. But it’s poorly suited to the camera’s expansive capabilities. The left/right mode selector doesn’t show all the modes at once; Pano and Time-lapse are cut off. It also doesn’t rotate with the camera, so if you turn your phone to landscape orientation, the mode-selection text ends up on its side.
The additional buttons at the top are a mess. How is anyone supposed to know that a bunch of concentric circles means “Live Photo”? Why is there no Live Photo option in Square mode? Why do I have to dig into the Settings app to change video resolution and frame rate instead of just tapping on the label in the Camera app?
It’s time for a rethinking of the entire photo- and video-taking experience on the iPhone. One that remains simple, but is clearer about which features and modes are available. One that puts all the relevant options in the app instead of burying half of them in the Settings app. A design that is extensible by nature, so Apple can experiment with new and creative modes and features without simply piling on complexity.
Most importantly, the camera app needs optional settings for photography enthusiasts who know what they’re doing and don’t always want to shoot in Auto mode. Let us set specific shutter speeds, white balance, and ISO settings. Give us a live histogram, focus lock, and focus peaking. Naturally, Apple wouldn’t want to clutter up the default interface with all this stuff, but such options should be available in a user-selectable “pro” mode.
The wow factor
Apple seems content to improve its camera in very traditional ways: improving dynamic range, sharpness, and color in a variety of modes that mostly mimic what point-and-shoot cameras do. That’s all very important, but it doesn’t excite customers like it did a few years ago. It doesn’t have that “wow factor.”
Just look at how “wowed” everyone has been with Google’s Night Sight mode. That’s the sort of nifty trick that, while limited in scope, makes people’s jaws drop. Had it landed on a phone as popular as the iPhone instead of the comparatively rare Pixel 3, we would have seen #nightsight as a global trend. If Apple did it first, it would have become one of those iPhone-defining characteristics that wins new converts.
I think Apple should copy Night Sight with its own (superior!) super-low-light mode, but that’s not really the most important correction to make. The problem, as I see it, is that Apple doesn’t prioritize creative new uses of modern smartphone photography. Whether it’s Night Sight or the Huawei P30 Pro’s crazy super zoom, Apple should be out in front, not playing catch-up.
I want iPhone users to open up the camera app and say, “Wow!” Apple seems content with “Wow, what a good photo,” but I want to hear “Wow, I didn’t know phones could do that!”
Here’s an example: Nvidia has done some great research into using machine learning to produce intermediate frames in video, creating smooth slo-mo from standard frame rates. Imagine taking the iPhone’s 4K 60fps video and letting the Neural Engine process the video for a few seconds, turning it into a 480fps slow-motion video. Or better yet, start with the iPhone’s current 240fps 1080p slow-motion mode and turn that into a crazy 1,920fps super-slo-mo video. That’s the kind of wow factor the iPhone camera needs.
A change in attitude
Apple clearly believes that mobile photography is extremely important. The company devotes serious resources toward improving it at every stage of the pipeline, from sensors and optics to image processing, machine learning, and even file formats and encoding.
It would be foolish to imply that Apple needs to take iPhone photography more seriously; such a thing is hardly possible. Rather, I’m suggesting that Apple’s attitude toward iPhone photography needs to evolve, and its vision needs to expand.
It is as if Apple holds a high-end DSLR as the gold standard for taking photos and videos, and just a couple of years ago that would have been true. Today, advances in sensors, machine learning, and smartphone processing power have allowed phones, in some ways, to surpass what is possible with a great traditional camera. Features like Google’s Top Shot (which automatically captures additional frames and uses AI to suggest a better shot than the one you took) are perfect examples of thinking beyond simply improving the quality of photos and videos. It’s about creatively using the intelligence of our little pocket supercomputers to take different kinds of pictures, to improve the entire photo-taking process, and to have more fun doing it.