iPhone 14 Pro 3:2 Leak = Film Simulation as Next-Gen Photography Styles

The recent iOS 16 Beta 2 release indicates that Apple has tested a 3:2 aspect ratio stills mode for the Camera app. This setting is not exposed to the user and has not been listed as an official iOS 16 feature (as far as I can tell), so either Apple will introduce it quietly in a later update or they are working on something for the iPhone 14. Note that the current default aspect ratio of iPhone shots is 4:3.

I’ve long called for the next generation of Photographic Styles to move beyond simple contrast and saturation/warmth adjustments into the realm of film simulation profiles, much like Fujifilm’s digital cameras offer today. Film simulation is much more advanced (and opinionated) than what Apple currently offers: it adds artificial grain, fundamentally changes color tones rather than just saturation (think Lightroom’s Color Calibration), simulates halation, and so on. Artificial grain, for example, is not as simple as sticking a grain filter onto an image, as many editing applications do today, because the actual grain of film changes in appearance depending on how much light the film receives in a given part of the image.
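To make that last point concrete, here is a toy sketch of luminance-dependent grain. This is my own simplified illustration, not Fujifilm’s or Apple’s actual algorithm; the Gaussian noise model and the midtone-peaked visibility curve are assumptions chosen just to show the difference from a flat grain overlay:

```python
import numpy as np

def film_grain(image, strength=0.08, seed=0):
    """Toy luminance-dependent grain: midtones get the most grain,
    deep shadows and blown highlights get the least, loosely mimicking
    how grain shows up on real film.

    image: float array in [0, 1], shape (H, W) or (H, W, 3).
    """
    rng = np.random.default_rng(seed)
    # Per-pixel luminance (channel average if RGB -- a crude stand-in
    # for a proper luma computation).
    luma = image.mean(axis=-1, keepdims=True) if image.ndim == 3 else image
    # Assumed visibility curve: 4*l*(1-l) peaks at 1 in the midtones
    # (l = 0.5) and falls to 0 at pure black and pure white.
    visibility = 4.0 * luma * (1.0 - luma)
    noise = rng.standard_normal(image.shape)
    return np.clip(image + strength * visibility * noise, 0.0, 1.0)
```

A “dumb” flat filter would just add `strength * noise` uniformly across the frame; the `visibility` term is what makes the grain respond to exposure the way the paragraph above describes.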

One of the most widely reported rumors about the 14 Pro series is the move to a 48MP sensor after seven years of 12MP sensors dating back to the 6s in 2015, so there’s no doubt the camera will be (as usual) a massive focus of this year’s Pro models. Beyond camera hardware upgrades, Apple likes to introduce new camera software features to go along with hardware improvements…

So what does the 3:2 aspect ratio have to do with all of this? Well, that’s the aspect ratio of a 35mm film strip (a full frame measures 36mm × 24mm), popularized by the Leica camera (and we all know how much Steve Jobs and Apple love Leica).

I think the software feature for this year’s Pro series will be the introduction of “AI-powered” film simulation profiles built right into the camera app that are integrated at capture time, much like photographic styles are today.

It will be “powered by AI” in the sense that they’ll say the Neural Engine perfectly reproduces film signatures (such as the precise application of grain I mentioned) using “raw data from the camera,” as opposed to “dumb” flat filters that look janky. This will also be their excuse for why older iPhones like the 13 Pro can’t have it, just as they said about Photographic Styles.

I think Apple will continue to build customizable, unique capture-time Photographic Styles like this, not only because they look better than post-capture filters in third-party apps, but also because they turn the annual camera comparison of “here’s how iPhone 14 photos look compared to the Pixel 7” into something less binary. You are no longer comparing one company’s algorithm to another’s; instead, the customer is creating their own camera by choosing the look they want everything to have. Photographic Styles already do this, but film simulation would take it to another level.

Thoughts? Of course, there’s no guarantee they’ll do this, and film simulation has long been on my list of “things I want from Apple that probably won’t happen,” but seeing the leaked 3:2 aspect ratio instantly reminded me of film cameras.