The LiDAR sensor that Apple launched in 2020, first on the iPad Pro and later on the iPhone 12 Pro and iPhone 12 Pro Max, allows us to achieve augmented reality effects thanks to its ability to accurately place objects in 3D space. And while some apps have already taken advantage of it, this technology offers far more than we have seen so far and is well suited to the development of other devices.
As iPhone users, we notice the LiDAR sensor in the camera quality of the new iPhone 12 Pro models and in the very limited demonstrations that some applications offer so far. But the contributions of this technology could really be seen in virtual reality glasses.
The possibilities of the LiDAR sensor in a new Apple device
Currently, augmented reality headsets like Magic Leap and HoloLens already pre-scan their surroundings before placing objects in them, and Apple's LiDAR-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the glasses.
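This pre-scanning behavior is exposed to developers through ARKit's scene reconstruction API. A minimal sketch of how an app might enable it is shown below; the function name `startMeshScanning` is illustrative, but `ARWorldTrackingConfiguration`, `supportsSceneReconstruction(_:)`, and the `sceneReconstruction` option are real ARKit APIs, available only on LiDAR-equipped devices.

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-based scene reconstruction in ARKit.
// `startMeshScanning` is a hypothetical helper; the ARKit calls are real,
// but this only runs on a LiDAR-equipped device (iPad Pro, iPhone 12 Pro).
func startMeshScanning(on arView: ARView) {
    // Scene reconstruction requires the LiDAR scanner.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return // no LiDAR on this device
    }
    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .mesh       // build a 3D mesh of the surroundings
    config.environmentTexturing = .automatic // capture imagery to texture virtual objects
    arView.session.run(config)
}
```

With this configuration running, ARKit continuously delivers mesh anchors describing the scanned environment, which is what lets virtual objects sit convincingly behind and on top of real surfaces.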
The LiDAR sensor can be used to scan 3D objects and spaces and overlay photographic imagery on top of them, a technique called photogrammetry. This could be the next wave of capture technology for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that information with others could turn these LiDAR-equipped phones and tablets into 3D content-capture tools. LiDAR could also be used without the camera element to take measurements of objects and spaces.
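To give an idea of what that captured 3D data looks like to a developer, here is a hedged sketch of reading the mesh anchors ARKit produces during a LiDAR scan. The helper name `meshSummary` is invented for illustration; `ARMeshAnchor`, `ARMeshGeometry`, and the `vertices` geometry source are real ARKit types, and the frame is assumed to come from a session running with scene reconstruction enabled.

```swift
import ARKit

// Sketch: inspect the LiDAR mesh in the current frame.
// `meshSummary` is a hypothetical helper for illustration; it assumes the
// session was started with `sceneReconstruction = .mesh`.
func meshSummary(for frame: ARFrame) -> (anchorCount: Int, vertexCount: Int) {
    // Each ARMeshAnchor holds one chunk of the reconstructed environment mesh.
    let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }
    // Total vertices across all chunks gives a rough sense of scan detail.
    let vertexCount = meshAnchors.reduce(0) { $0 + $1.geometry.vertices.count }
    return (meshAnchors.count, vertexCount)
}
```

From data like this, an app could export a mesh for sharing, or measure distances between points on the scanned surfaces without involving the camera image at all.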