The world has long argued about smartphone cameras as they have rapidly evolved over the past few years, but for many, an iPhone sets the benchmark for the balance between optical performance and versatility. Filmmakers are increasingly taking advantage of the iPhone's form factor, a third-party videography and editing app ecosystem that includes the Blackmagic Camera app, and features such as 4K120fps recording and gimbal-rivalling stabilisation in Action mode. The Mumbai Academy of Moving Image (MAMI) showcased four short films as part of the latest edition of its MAMI Select: Filmed on iPhone program.
In an exclusive conversation with HT, Kaiann Drance, Apple’s vice president of Worldwide iPhone Product Marketing, points to the conversations the company has had with filmmakers, the importance of generational hardware improvements such as the latest quad-pixel sensor that underpins the new 48-megapixel Fusion camera, and Apple’s approach to photography and videography upgrades since it embarked on its computational photography journey with HDR fusion in the iPhone 4, launched in 2010.
“Since iPhone was first introduced, it has become one of our most personal devices and is the world’s most popular camera. Not only has it enabled photographers, filmmakers, and creators to do what they do best, but made filmmaking and content creation more accessible. Watching how this accessibility has translated into an expansive breadth and range of iPhone films throughout the years has been incredible, and the films debuted here at MAMI are inspiring examples of what’s possible on iPhone,” says Drance.
For filmmakers such as Amrita Bagchi, the ecosystem helps. “We’re tracking bubbles and plastic sheets flying through the air, and the depth of field is so clean. Just like it’s shot on a huge, high-budget cinematic camera,” she says. Bagchi also points to graphically demanding workflows, including overlaying the industry-standard Rec. 709 colour space on ProRes Log footage captured on iPhone, which her M4 Max MacBook Pro handles with ease.
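For readers curious about what that grading step involves, here is a minimal Python sketch of converting log-encoded footage to a Rec. 709 look for preview. The curve constants are illustrative placeholders, not Apple Log's published parameters, and a real grade would use the camera vendor's official LUT inside an editing app.

```python
import numpy as np

def log_to_rec709_preview(frame_log: np.ndarray) -> np.ndarray:
    """Rough preview transform from a generic log encoding to a Rec. 709 look.

    `frame_log` holds log-encoded RGB values as floats in [0, 1]. The curve
    constants below are illustrative placeholders, not Apple Log's published
    parameters; a real grade would use the camera vendor's official LUT.
    """
    # 1) Undo the log curve to recover an approximately scene-linear image.
    offset, slope = 0.92, 0.6                      # placeholder log-curve constants
    linear = np.power(10.0, (frame_log - offset) / slope)
    linear = np.clip(linear, 0.0, 1.0)

    # 2) Apply the Rec. 709 transfer function for display-referred output.
    low = linear < 0.018
    rec709 = np.where(low, 4.5 * linear, 1.099 * np.power(linear, 0.45) - 0.099)
    return np.clip(rec709, 0.0, 1.0)
```

In practice, a transform of this kind is baked into a 3D LUT that editing apps apply to the log footage in real time, which is the part of the workflow that leans on the laptop's GPU.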
Drance points to how Rohin Raveendran Nair used the smaller profile of the iPhone 16 Pro Max to place a camera inside a typewriter and capture its point of view, something that would not have been possible with traditional camera equipment. She also points to Bagchi’s horror film as an example of using Cinematic mode to bring a new dimension to video storytelling. In Tinctoria, Bagchi creates a claustrophobic atmosphere for the film’s opening montage with Cinematic mode, without the need for a large-scale camera rig. “It’s truly inspiring to see iPhone continue to be a key tool for creatives,” says Drance.
Nair framed his point-of-view shots in a 4:3 aspect ratio to replicate the perspective of a vertical sheet of paper. These shots are juxtaposed against a wider 2:1 aspect ratio for the expansive backwater landscapes. Nair also uses a bloom filter to create a halo around the highlights.
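A bloom of that kind is typically built from a simple threshold, blur and blend chain. The sketch below is a generic illustration of the technique in Python with OpenCV, not the specific filter or settings Nair used; the parameter values are assumptions.

```python
import cv2
import numpy as np

def add_bloom(frame_bgr: np.ndarray, threshold: int = 220,
              sigma: float = 15.0, strength: float = 0.6) -> np.ndarray:
    """Classic bloom: isolate the highlights, blur them, add the glow back.

    `frame_bgr` is an 8-bit BGR frame (as read by cv2.imread or VideoCapture).
    Parameter values are illustrative, not the filmmaker's actual settings.
    """
    # Keep only pixels brighter than the threshold (the highlights).
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    highlights = cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)

    # Blur the highlights so they spill softly beyond their edges.
    glow = cv2.GaussianBlur(highlights, (0, 0), sigmaX=sigma)

    # Blend the glow back over the original frame to form the halo.
    return cv2.addWeighted(frame_bgr, 1.0, glow, strength, 0)
```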
Part of what makes iPhone so powerful is its seamless integration of new camera hardware, software, and Apple Silicon to go beyond what camera hardware alone can provide.
“These parts build upon one another to create something even better. A great example of this is 4K120 in Dolby Vision for creative flexibility. It required A18 Pro getting the ability to support 4K resolution with higher frame rates, including updates to the video encoder. It required our new 48MP Fusion camera system with a second-generation quad-pixel sensor that enables 2x faster sensor data readout speeds. And it required powerful software algorithms and an intuitive user interface to enable capturing and smooth editing in 4K120,” says Drance, explaining the interconnection between hardware and software.
A ‘walled garden’, if you will, but one that allows Apple to ensure there are no weak links in the chain. “You can see beautiful examples of 4K120 in Shalini Vijayakumar’s film Seeing Red, where she uses the feature to capture her female characters,” she says.
“Using the 5x Telephoto lens, I’m able to place the men in front as they discuss the fate of the women in the background. There’s so much storytelling in that one frame through that particular lens,” Vijayakumar explains.
That is when Drance points to an important and often overlooked part of the smartphone camera experience: the audio, what you hear. In the latest-generation iPhones, studio-quality mics are paired with advanced algorithms, underpinned by the A18 Pro chip, to deliver a feature called Audio Mix.
This is something HT previously detailed in its iPhone 16 series analysis. The In-Frame option is the ideal choice for videos shot in noisy environments (such as with wind noise), Studio sound works best for recording interviews and general conversations, while Cinematic adds a definite depth and a sense of place to the visuals being recorded. Each makes a perceptible difference, we had noted.
“Chanakya Vyas’ film Mangya is a great example of a film that utilises this combination to produce true to life audio despite difficult filming amongst chicken coops. The filmmaker was able to capture the sounds of footsteps, a rooster crowing, and even the whirring sound of a fan with clarity - a feat that would normally require multiple pieces of equipment,” Drance points out.
But as competition from Android phone makers dials up, with many of them still betting big on higher pixel counts as the basis for better photos and videos, how does Apple approach the balance between resolution, sensor size, and post-processing to deliver superior image quality?
“Our technological innovations across hardware and software ensure that everyone from studio production teams to everyday consumers can capture the best photos and videos with ease. iPhone 16 Pro and Pro Max are the industry chosen devices for professional photographers and filmmakers, and that’s a responsibility we take very seriously. Its 48MP (megapixel) Fusion camera uses a new second-generation quad-pixel sensor that enables 2x faster sensor data readout speeds, and supports 4K120 video,” Drance says.
“We’ve seen filmmakers create a wonderful dreamy effect by adjusting the playback speed of 4K120fps footage to 60fps when subjects are in motion, or even slowing it down to one-fifth speed, corresponding to 24fps, and mixing it with Cinematic mode shots,” she adds. Drance believes this deep integration of hardware and software allows users to capture more detail in wider-angle shots or in a beautiful up-close macro shot. “The results we’ve seen from filmmakers have been awe-inspiring,” she summarises.
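The arithmetic behind that dreamy effect is straightforward: footage captured at 120fps and conformed to a 60fps timeline plays at half speed, while conforming it to 24fps yields one-fifth speed. A small sketch of the calculation, purely for illustration:

```python
def slowdown(capture_fps: float, playback_fps: float, clip_seconds: float):
    """How much slower high-frame-rate footage plays when conformed to a lower rate."""
    factor = capture_fps / playback_fps           # e.g. 120 / 24 = 5x slower
    return factor, clip_seconds * factor          # stretched duration on the timeline

print(slowdown(120, 60, 10))   # (2.0, 20.0) -> half speed, 10s becomes 20s
print(slowdown(120, 24, 10))   # (5.0, 50.0) -> one-fifth speed, 10s becomes 50s
```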
Key to Apple leveraging features such as Night mode, Deep Fusion, and Cinematic mode has been the use of artificial intelligence (AI). Drance agrees, adding, “AI and Machine Learning are integral to virtually all the products we build. We pioneered computational photography in smartphones starting with HDR fusion in iPhone 4, followed by a big update with multi-frame noise reduction, and we took a huge step forward with the Neural Engine in 2017. Since then, we have been improving that capability with new hardware, software, and powerful silicon.”
There are examples of how this plays out. Cinematic mode uses machine learning to create videos with a focus on depth of field and subject tracking. The technology analyses the video feed in real time, identifies subjects such as people or objects, and intelligently adjusts focus and depth effects to create a more cinematic look. Users also have the option to adjust focus points after capture (that is, at the edit stage) for even more creative control.
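Apple has not published how Cinematic mode renders its depth of field, but the general idea of depth-driven blur can be illustrated with a simple sketch: given a per-pixel depth map, pixels are blurred in proportion to their distance from a chosen focal plane. The Python function below is a hypothetical illustration, not Apple's pipeline; because the blur is synthesised from the depth map, the focal plane can be changed after capture simply by re-running with a different focus depth.

```python
import cv2
import numpy as np

def synthetic_depth_blur(frame_bgr: np.ndarray, depth: np.ndarray,
                         focus_depth: float, max_sigma: float = 12.0,
                         bands: int = 4) -> np.ndarray:
    """Blur each pixel in proportion to how far its depth is from the focus plane.

    `depth` is a per-pixel depth map (same height and width as the frame),
    normalised to [0, 1]; `focus_depth` is the depth the focus is racked to.
    Illustrates the idea of depth-driven shallow focus, not Apple's pipeline.
    """
    # Distance of every pixel from the chosen focal plane; 0 means in focus.
    distance = np.abs(depth - focus_depth)
    distance = distance / max(distance.max(), 1e-6)

    out = frame_bgr.copy()
    for b in range(1, bands + 1):
        # Pixels in this distance band get progressively stronger blur.
        lo, hi = (b - 1) / bands, b / bands
        mask = (distance > lo) & (distance <= hi)
        if not mask.any():
            continue
        sigma = max_sigma * (b / bands)
        blurred = cv2.GaussianBlur(frame_bgr, (0, 0), sigmaX=sigma)
        out[mask] = blurred[mask]
    return out
```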
Then there is Action mode, which uses machine learning to capture fast-moving subjects with improved clarity and detail for smoother videos. “It analyses the camera feed in real-time, identifies motion and subjects, and intelligently adjusts camera settings such as shutter speed, focus, and exposure. By doing so, it minimises motion blur and maintains sharp focus on the subject,” explains Drance. Action mode on the latest-generation iPhones replicates gimbal-esque stability.
“There’s no time to mount the camera on a traditional gimbal. But with Action mode, I could even shoot multiple takes. The stabilisation is impressive,” says Chanakya Vyas.
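Apple does not detail Action mode's sensor-fused, ML-driven pipeline, but the broad principle of software stabilisation, estimating camera motion, smoothing the camera's trajectory, and warping each frame onto the smoothed path, can be sketched with classical computer vision tools. The Python example below is a generic, simplified illustration under those assumptions, not Action mode's actual algorithm.

```python
import cv2
import numpy as np

def stabilise(in_path: str, out_path: str, radius: int = 15) -> None:
    """Classical two-pass stabilisation: track motion, smooth the path, re-warp.

    A generic illustration of the principle only; Action mode's real pipeline
    is not public.
    """
    cap = cv2.VideoCapture(in_path)
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    motions = []  # per-frame (dx, dy, d_angle) between consecutive frames

    # Pass 1: estimate frame-to-frame camera motion from tracked corners.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pts = cv2.goodFeaturesToTrack(prev_gray, 200, 0.01, 30)
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        m, _ = cv2.estimateAffinePartial2D(pts[status.flatten() == 1],
                                           nxt[status.flatten() == 1])
        if m is None:                     # tracking failed; assume no motion
            m = np.eye(2, 3)
        motions.append((m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])))
        prev_gray = gray

    # Smooth the accumulated camera trajectory with a moving average.
    motions = np.array(motions)
    trajectory = np.cumsum(motions, axis=0)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    smoothed = np.column_stack([np.convolve(trajectory[:, i], kernel, "same")
                                for i in range(3)])
    corrections = motions + (smoothed - trajectory)

    # Pass 2: re-read the clip and warp each frame onto the smoothed path.
    cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                          cap.get(cv2.CAP_PROP_FPS), (w, h))
    for dx, dy, da in corrections:
        ok, frame = cap.read()
        if not ok:
            break
        warp = np.array([[np.cos(da), -np.sin(da), dx],
                         [np.sin(da),  np.cos(da), dy]], dtype=np.float32)
        out.write(cv2.warpAffine(frame, warp, (w, h)))
    cap.release()
    out.release()
```

Software stabilisation of this kind necessarily crops slightly into the frame to leave room for the corrective warp, which is why stabilised clips trade a little field of view for steadiness.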
App developers play an even greater role for Apple, as the camera stakes get higher. “Developers are important to iPhone and the Apple ecosystem and we love seeing the work they do on our products and the custom controls they give to creators,” says Drance.
A popular third-party app for filmmakers is Blackmagic Camera. With Camera Control, it lets users swipe through custom LUTs to pick a preferred look, and control zoom, exposure, manual focus, zebra, and focus peaking. Also popular with filmmakers are apps such as Kino and Pro Camera. Apple is confident more app developers will be able to take advantage of the close-knit iPhone camera foundation.