The iPhone 14 is almost certainly just around the corner, and the rumor mill is churning furiously as we head toward its launch. There's plenty to look forward to, but it's the camera I'm particularly hoping to see make some real strides forward.
Cameras on Apple's phones have always been great: the latest iPhones can take the kind of shots you'd expect from professional cameras, and even the cheapest iPhone SE can take beautiful pictures on your summer vacation. But rivals have caught up, packing amazing camera systems that mean Apple no longer has the dominance it once did.
So I sat and dreamed about how I would redesign Apple’s camera system for the iPhone 14 to hopefully secure its position as the best camera phone around. Apple, take note.
A much larger image sensor on the iPhone 14
The image sensors inside phones are small compared to those found in professional cameras like the Canon EOS R5. The smaller the image sensor, the less light can hit it, and light is everything in photography. More light captured means better-looking pictures, especially at night, which is why professional cameras have sensors many times larger than those found in phones.
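To put rough numbers on it, here's a quick sketch comparing sensor areas, since light gathered scales roughly with area. The phone-sensor dimensions are my own ballpark assumption (Apple doesn't publish exact figures); the 1-inch, APS-C, and full-frame sizes are the standard nominal dimensions.

```swift
import Foundation

// Back-of-the-envelope comparison: light gathered scales roughly with
// sensor area. The phone dimensions below are an assumption for a
// typical flagship main sensor, not an official Apple spec.
struct Sensor {
    let name: String
    let widthMM: Double
    let heightMM: Double
    var areaMM2: Double { widthMM * heightMM }
}

let phone = Sensor(name: "typical phone main sensor (assumed)", widthMM: 7.6, heightMM: 5.7)
let others = [
    Sensor(name: "1-inch (Panasonic CM1)", widthMM: 13.2, heightMM: 8.8),
    Sensor(name: "APS-C (many mirrorless cameras)", widthMM: 23.6, heightMM: 15.6),
    Sensor(name: "full frame (Canon EOS R5)", widthMM: 36.0, heightMM: 24.0),
]

for s in others {
    let ratio = s.areaMM2 / phone.areaMM2
    print(String(format: "%@: %.0f mm², about %.1fx the light-gathering area", s.name, s.areaMM2, ratio))
}
```

Even with generous assumptions about the current sensor, an APS-C chip would capture several times more light.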
Why are phone cameras lacking in this regard? Because image sensors have to fit into pocketable phone bodies where space is at a premium. But there is certainly some room to play: phones as far back as the 2015 Panasonic CM1 have squeezed in a 1-inch sensor, which can offer significantly better dynamic range and low-light flexibility, so it's not too wild to hope for a much larger image sensor inside the iPhone 14.
Sure, Apple does amazing things with its computational photography to squeeze every ounce of quality out of its tiny sensors, but if it combined those same software skills with a much bigger image sensor, the difference could be dramatic. A 1-inch image sensor certainly can't be ruled out, but I'd really like to see Apple push things even further with an APS-C size sensor like those found in many mirrorless cameras.
Okay, not all three cameras could get massive sensors, otherwise they simply wouldn't fit in the phone, but perhaps just the main one could get a size upgrade. Either that, or have one massive image sensor and mount the lenses on a rotating dial on the back so you can physically change the angle of view depending on the scene. I'll be honest, that doesn't sound like a very Apple thing to do.
Zoom to finally match Samsung
While I generally find that images taken with the iPhone 13 Pro's main camera look better than those taken with the Galaxy S22 Ultra, there is one area where the Samsung wins: the telephoto zoom. The iPhone's optical zoom reaches 3x, but the S22 Ultra offers up to 10x optical zoom.
And the difference it makes in the shots you can get is astounding. I love zoom lenses because they let you find all kinds of hidden compositions in a scene instead of just using a wide-angle lens and capturing everything in front of you. I find they allow for more artistic, thoughtful images, and while the iPhone's zoom helps get those compositions to some extent, it's no match for the S22 Ultra.
So the phone needs a proper zoom lens that relies on good optics, not just digital cropping and upscaling, which always results in pretty muddy shots. It should have at least two levels of optical zoom: 5x for portraits and 10x for more detailed landscapes. Or better yet, it would let you zoom continuously between those levels to find the perfect composition, rather than simply choosing between two fixed zoom options.
Personally, I think 10x is the most Apple would need to go. Sure, Samsung actually brags that its phone can zoom up to 100x, but the reality is that those shots rely heavily on digital cropping, and the results are terrible. 10x is huge: it's the equivalent of carrying a 24-240mm lens for your DSLR, wide enough for sweeping landscapes, with enough reach for wildlife photography too. Ideal.
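As a quick sanity check on that 24-240mm comparison: 35mm-equivalent reach is just the wide end multiplied by the optical zoom factor. The 24mm baseline here is taken from the comparison above, not an official spec.

```swift
// Equivalent focal length = wide-end equivalent x optical zoom factor.
// 24mm is assumed as the wide end, matching the 24-240mm comparison above.
let wideEndMM = 24.0
for zoom in [3.0, 5.0, 10.0] {
    print("\(Int(zoom))x optical zoom ≈ \(Int(wideEndMM * zoom))mm equivalent")
}
```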
Pro video controls built into the default camera app
By introducing ProRes video on the iPhone 13 Pro, Apple has given a strong signal that it sees its phones as a truly useful video tool for professional creatives. ProRes is a video codec that captures massive amounts of data and allows for greater control over editing in post-production software such as Adobe Premiere Pro.
But the camera app itself is still pretty basic, with video settings mostly limited to turning ProRes on or off, switching between lenses, and changing resolution. And that's kind of the point: it's the easiest way to film beautiful shots without any fuss. But professionals looking to use ProRes will likely want more manual control over things like white balance, focus, and shutter speed.
And yes, that's why there are apps like Filmic Pro that give you incredibly fine control over all of these settings to get exactly the look you want. But it would be nice if Apple found a way to make these settings available in the default camera app. That way, you could open the camera from the lock screen, dial in a few settings, and start shooting right away, confident that you'll get exactly what you wanted out of your video.
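For what it's worth, the hooks already exist in iOS: third-party apps get their manual control through AVFoundation. Here's a minimal sketch of the kind of thing an app like Filmic Pro does under the hood; the API calls are real AVCaptureDevice methods, but the setup is simplified and the specific values are placeholders, not recommendations.

```swift
import AVFoundation

// Locks shutter speed, ISO, focus, and white balance on a capture device,
// the manual controls Apple's stock Camera app currently hides.
func lockManualLook(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    if device.isExposureModeSupported(.custom) {
        // Fix shutter speed and ISO so exposure can't drift mid-take
        // (1/48s is the classic choice for 24fps video; ISO 400 is a placeholder).
        let shutter = CMTime(value: 1, timescale: 48)
        device.setExposureModeCustom(duration: shutter, iso: 400, completionHandler: nil)
    }

    if device.isLockingFocusWithCustomLensPositionSupported {
        // Pin focus to a fixed lens position (0.0 = nearest, 1.0 = infinity).
        device.setFocusModeLocked(lensPosition: 0.8, completionHandler: nil)
    }

    if device.isLockingWhiteBalanceWithCustomDeviceGainsSupported {
        // Freeze white balance at the current reading so color stays
        // consistent for the whole shot.
        device.setWhiteBalanceModeLocked(with: device.deviceWhiteBalanceGains,
                                         completionHandler: nil)
    }
}
```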
Focus stacking in the iPhone camera
Imagine finding a beautiful mountain wildflower with a towering alpine peak behind it. You tap the flower and it snaps into sharp focus, but now the mountain is blurred, and tapping the mountain instead throws the flower out of focus. This is a common problem when trying to focus on two elements of a scene that are far apart, and experienced landscape and macro photographers work around it with a technique called focus stacking.
Focus stacking means taking a series of pictures with a steady camera, focusing on a different element of the scene each time. These images are later blended together, usually in desktop software such as Adobe Photoshop or specialized focus-stacking software such as Helicon Focus, to create an image that is sharp from the extreme foreground to the background. It's the opposite of the camera's Portrait mode, which deliberately blurs the background around the subject for that pleasing shallow depth of field, or 'bokeh'.
It might be wishful thinking, but I'd love to see this focus stacking capability built into the iPhone, and maybe it wouldn't be that difficult. After all, the phone already uses frame-blending technology to combine different exposures into a single HDR shot; it would essentially be doing the same thing, just with focus points rather than exposures.
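To illustrate why it might not be that difficult, here's a toy sketch of the core focus-stacking step: for each pixel, keep the value from whichever frame is locally sharpest. It assumes pre-aligned grayscale frames as plain 2D arrays; real tools like Helicon Focus also handle alignment and smooth blending.

```swift
// Toy focus stack over pre-aligned grayscale frames (2D arrays of brightness).
typealias Frame = [[Double]]

// Local sharpness at (x, y): magnitude of a simple Laplacian, which
// responds strongly to in-focus edges and weakly to blurred ones.
func sharpness(_ f: Frame, _ x: Int, _ y: Int) -> Double {
    abs(4 * f[y][x] - f[y - 1][x] - f[y + 1][x] - f[y][x - 1] - f[y][x + 1])
}

func focusStack(_ frames: [Frame]) -> Frame {
    let height = frames[0].count
    let width = frames[0][0].count
    var result = frames[0]
    // Skip the 1-pixel border so the Laplacian never reads out of bounds.
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            // Take this pixel from the frame with the strongest local detail.
            let sharpest = frames.max { sharpness($0, x, y) < sharpness($1, x, y) }!
            result[y][x] = sharpest[y][x]
        }
    }
    return result
}
```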
Much better long exposure photography
Apple has had the ability to take long exposure photos on the iPhone for several years. You've seen the shots: waterfalls or rivers where the water has been artificially blurred while the rocks and landscape around it stay sharp. It's a great technique for really bringing out the movement in a scene, and it's something I like to do on my proper camera and on my iPhone.
And while it's easy on the iPhone, the results are only OK. The problem is that the iPhone uses a moving image, a Live Photo, to detect movement in the scene and then digitally blurs it, which usually means that bits of the scene get blurred that shouldn't be. The result is images that look quite mushy, even when you put the phone on a mobile tripod for stability. They're fine to send to your family or maybe post on Instagram, but they won't look good printed and framed on your wall, which I think is a shame.
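For reference, the computational trick behind these shots is simple: average a burst of short frames, so anything that moved smears out while static detail survives, but only if every frame lines up perfectly. A rough sketch, again with grayscale frames as 2D arrays:

```swift
// Simulated long exposure: the per-pixel mean of a burst of frames.
// Moving water averages into a blur; rocks stay sharp only if the
// frames are perfectly aligned, which is where the mushiness creeps in.
typealias Frame = [[Double]]

func simulateLongExposure(_ frames: [Frame]) -> Frame {
    let height = frames[0].count
    let width = frames[0][0].count
    var sum = Frame(repeating: [Double](repeating: 0, count: width), count: height)
    for frame in frames {
        for y in 0..<height {
            for x in 0..<width {
                sum[y][x] += frame[y][x]
            }
        }
    }
    // Divide by the frame count to get the time-averaged image.
    let count = Double(frames.count)
    return sum.map { row in row.map { $0 / count } }
}
```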
I'd like to see Apple make better use of its optical image stabilization to allow really sharp long-exposure photos, not just of water but of night scenes too, like car headlights streaking down the street. It would be another great way to get creative with your phone photography and take advantage of the excellent quality of these cameras.