For iPhone fans who take shooting photos seriously, the iPhone 14 Pro is the best iPhone to buy – and it looks like its successor, the iPhone 15 Pro, is going to be even better. What at first looks like a fairly minor upgrade could make a big difference to the quality of your pictures and to the efficiency of effects such as Portrait Mode.
According to well-connected industry analyst Ming-Chi Kuo, Apple is replacing the current suppliers of the Pro's LiDAR scanner and going with Sony instead. That's significant, because Sony's version is much more energy efficient than its rivals.
That means Apple could do one of two things: focus on energy efficiency, with the new system placing less drain on the battery; or push the sensor harder to get better results with the same amount of energy use. Either option could make a significant difference to your next iPhone.
What does the iPhone LiDAR sensor do?
Every Pro model since the iPhone 12 Pro and Pro Max has included a LiDAR sensor. LiDAR stands for Light Detection And Ranging: it bounces invisible light off objects and times the reflections to work out how far away they are. Augmented reality apps can use it to model the space around you, and the camera app uses it to improve its focus accuracy and speed. It's also a key component of Portrait Mode.
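The ranging part boils down to time-of-flight: measure how long a light pulse takes to bounce back, and you know the distance. Here's a minimal sketch of that calculation in Python – a simplified illustration of the principle, not Apple's actual implementation, and the example timing value is hypothetical:

```python
# Time-of-flight ranging: a light pulse travels to an object and back,
# so the one-way distance is (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_metres(round_trip_seconds: float) -> float:
    """Distance to an object from the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example (hypothetical timing): a pulse returning after about
# 13.34 nanoseconds means the object is roughly 2 metres away.
print(f"{distance_metres(13.34e-9):.2f} m")
```

The tiny numbers involved are why a sensor's quality matters so much: resolving objects centimetres apart means timing reflections to fractions of a nanosecond, which takes precise (and power-hungry) hardware – exactly the trade-off a more efficient Sony sensor would ease.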
I don't have any inside knowledge here but I suspect Apple is more interested in pushing its LiDAR performance than using the new sensor to eke out a bit more battery life.
First of all, Apple is focusing – no pun intended – on camera capabilities as a key selling point; you can't really see slightly better GPU performance, but camera improvements are visible and have an obvious gee-whiz factor.
And secondly, there's AR. It's no secret that Apple thinks augmented reality is a key part of our phones' futures, and the iPhone will of course be working closely with the Apple AR/VR headset and its more affordable successors.