A laser rangefinder measures the depth of a scene by emitting light and timing how long it takes the light to reach an object and bounce back to the sensor.
Apple has equipped the iPhone 12 Pro and iPhone 12 Pro Max with a laser rangefinder (lidar) so that developers can take advantage of the new hardware in their apps.
Snapchat's latest update offers a visual demonstration of why the iPhone 12 Pro and iPhone 12 Pro Max need a lidar.
Snapchat has announced a new version of Lens Studio, which allows designers to create custom lenses using the iPhone 12 Pro and iPhone 12 Pro Max laser rangefinders.
The new version of the Snapchat app was briefly showcased during Apple's presentation, and the company later explained how the feature will work for iPhone 12 Pro owners. Beyond measuring distance, the lidar lets the phone build a high-resolution three-dimensional model of the scene that is rebuilt instantly as you rotate the camera.
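To get a rough sense of the time-of-flight principle, the distance to an object follows directly from the round-trip time of the light pulse. The snippet below is a minimal sketch of that arithmetic; the 13.3-nanosecond reading is an invented example value, not a figure from the article or from Apple.

```swift
import Foundation

// Time-of-flight principle: distance = (speed of light × round-trip time) / 2.
let speedOfLight = 299_792_458.0   // metres per second
let roundTripTime = 13.3e-9        // hypothetical measurement: 13.3 nanoseconds

let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.2f m", distance))  // ≈ 1.99 m
```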
This lets augmented reality applications place scene elements accurately, since the iPhone knows exactly where each object is and what its real dimensions are. Snapchat will use this technology to offer new lenses that can render thousands of AR objects in real time through the iPhone 12 Pro's camera.
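Snapchat has not published the code behind its lenses, but on lidar-equipped devices ARKit exposes the underlying depth data to any app. The sketch below shows how an AR session can opt in to per-pixel scene depth and a live-updating mesh of the surroundings; it illustrates the platform API, not Snapchat's own implementation.

```swift
import ARKit

// Minimal sketch: configure an AR session to use the lidar's depth data
// on devices that support it (iPhone 12 Pro / 2020 iPad Pro, ARKit 4+).
func startLidarSession(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Per-pixel depth map captured by the lidar sensor.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    // A triangle mesh of the scene that ARKit rebuilds as the camera moves.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    sceneView.session.run(configuration)
}
```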
The company has released Lens Studio 3.2, an update to its toolkit that lets designers create new Snapchat lenses compatible with the laser rangefinder in the iPhone 12 Pro and the 2020 iPad Pro.
Incidentally, the lidar in the Pro models is also used for autofocus in low-light conditions.