iPhone 12 Pro, Apple and LiDAR in the enterprise

Both the iPhone 12 Pro and 12 Pro Max possess Light Detection and Ranging (LiDAR) scanners. Why should it matter to your enterprise?

What is LiDAR?

LiDAR systems measure how long it takes emitted light to reflect back from a surface, use that time to calculate the distance traveled, and build 3D depth maps of what they see. The technology helps support realistic AR experiences and provides a huge boost to photography in low light.

LiDAR is not a new technology. It relies on tiny lasers that bounce pulses of light off surrounding objects and calculate spatial information from the time each pulse takes to make the round trip. Most people first became aware of it when it was used to help Apollo 15 map the moon’s surface. (NASA is now developing the technology for Mars missions.)
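The round-trip timing principle described above reduces to simple arithmetic: light travels out to a surface and back, so the one-way distance is half the total path. Here is a minimal sketch of that calculation (an illustration of the general time-of-flight principle, not Apple's actual implementation; the example timing value is hypothetical):

```python
# Time-of-flight ranging: estimate distance from the round-trip
# travel time of a light pulse. This is the principle behind LiDAR,
# not any particular vendor's implementation.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in vacuum


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance to a surface.

    The pulse travels out and back, so the surface sits at half
    the total path length the light covered.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# Example (hypothetical timing): a pulse returning after roughly
# 33.36 nanoseconds indicates a surface about 5 metres away.
print(distance_from_round_trip(33.356e-9))
```

Note how small the timescales are: resolving distances at centimetre precision means timing light pulses to fractions of a nanosecond, which is why dedicated sensor hardware is required.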

LiDAR has found many uses since then — it informs collision-detection systems in autonomous and semi-autonomous vehicles, and it’s expected to be deployed in all new cars sold (smart or otherwise) after 2030. It is also used in some smart vacuum cleaners (and by security researchers, who have figured out how to use those sensors to eavesdrop inside homes).

Apple and LiDAR

Vehicles and robot cleaners use LiDAR to sense objects around them and to prevent collisions, but the technology has other uses on Apple’s smartphones.

It’s important to note that Apple’s LiDAR implementation sends out multiple beams of light; cheaper systems use just one. With support from the A14 Bionic chip, Apple’s implementation is faster, more accurate, and has a longer range. It also means the devices can place virtual objects convincingly in complex scenes, using occlusion to hide them behind real-world objects.

Copyright © 2020 IDG Communications, Inc.