The new generation of the iPhone 12 is already a reality, and it makes a big camera leap. The photographic improvements introduced in this generation go beyond renewed sensors. For the first time in the history of its mobile phones, Apple has included a LiDAR scanner in two of them, the iPhone 12 Pro and the iPhone 12 Pro Max. This technology already debuted in the latest iPad Pro, launched by the company a few months ago. But how does this technology really work? Will it help improve the results of images captured with the rear cameras?
- Apple launched iPhone 12: specs, price and release date
- iPhone 12 will not have a 120Hz screen, and here’s why
- iPhone 12 mini is here: specs, price and release date
What is LiDAR and how does it work?
In purely linguistic terms, LiDAR is nothing more than an abbreviation of Laser Imaging Detection and Ranging. Although Apple has presented this technology as 'innovative', the truth is that LiDAR systems have been on the market for several years; it is their application in mobile devices that had not been practical until now.
Specifically, this system is used especially in tasks that require mapping the shape of a surface, whether to survey the relief of a geographical area or to obtain a three-dimensional model of an object to be analyzed or modified in CAD design programs. It is also the system used in laser speed guns to calculate the speed of a vehicle at a particular point.
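The speed-gun use mentioned above can be sketched in a few lines: the device takes two distance readings a short time apart, and the change in distance over time gives the vehicle's speed. This is only an illustrative simplification of the principle, not a real radar implementation; the numbers are made up.

```python
# Illustrative sketch: speed from two consecutive laser range readings.
# A real speed gun averages many pulses; this shows only the arithmetic.

def speed_from_ranges(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Average speed between two range measurements, in m/s."""
    return abs(d2_m - d1_m) / dt_s

# Vehicle closes from 100 m to 98 m in 0.1 s: 20 m/s, i.e. 72 km/h.
speed = speed_from_ranges(100.0, 98.0, 0.1)
print(speed, speed * 3.6)  # prints 20.0 72.0
```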
As for how LiDAR systems operate, the mechanics are broadly similar to those of a conventional camera, except that instead of capturing light, the sensor emits a series of laser pulses and measures the time they take to bounce back. From those measurements it captures the volume of objects and of the environment in general, so it can generate three-dimensional maps in real time without depending on other complementary sensors.
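The time-of-flight measurement described above reduces to one formula: a pulse travels to the object and back at the speed of light, so the distance is half the round trip. A minimal sketch of that calculation:

```python
# Time-of-flight principle behind a LiDAR sensor: distance is recovered
# from how long a laser pulse takes to bounce back off an object.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object, given the pulse's round-trip time.

    The pulse travels to the object and back, so the path is halved.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~33.35 nanoseconds puts the object ~5 m away.
print(round(distance_from_round_trip(33.35e-9), 2))  # prints 5.0
```

Repeating this measurement across a grid of pulses is what yields the real-time depth map.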
One of the great improvements that the new generation brings to the cameras of the iPhone 12 Pro and 12 Pro Max is an optimized Portrait mode in night scenes, an improvement directly related to the LiDAR sensor. The sensor calculates the distance and volume of people even in dark environments, which helps separate their bodies, and foreground objects, from the rest of the scene in the final image.
Apple has also confirmed that the new iPhone is capable of focusing up to 6 times faster than previous iPhones, so the improvements also apply to body detail at night. This is where the computational photography developed by the American company, Deep Fusion running on the Neural Engine, comes into play.
The last feature pointed out by Apple is Night mode, which the LiDAR sensor supports in combination with the wide-angle lens. With all these features, the iPhone 12 makes a big camera leap.